Goal of the Competition¶

Welcome to the "Predictive Medicine in Bioinformatics" competition hosted by InVitro Cell Research, LLC (ICR). In this notebook, we will embark on a data science journey to develop a predictive model capable of detecting three medical conditions based on measurements of health characteristics.

Objective:

The primary objective of this competition is to predict whether a person has any of three medical conditions (Class 1) or none of the three medical conditions (Class 0). By utilizing predictive models, we aim to streamline the process of determining these medical conditions, which traditionally requires invasive and time-consuming data collection from patients. Our predictive model will leverage measurements of key health characteristics to make reliable and private predictions, thus potentially revolutionizing the field of bioinformatics.

Context:

Aging is a significant risk factor for various health issues, including heart disease, dementia, hearing loss, and arthritis. Bioinformatics, an emerging field, focuses on finding interventions to slow and reverse biological aging and prevent age-related ailments. In this data science competition, we will explore how data-driven approaches can contribute to solving critical problems in bioinformatics, even when dealing with small datasets.

About the Competition Host (InVitro Cell Research, LLC):

Founded in 2015, InVitro Cell Research, LLC (ICR) is a pioneering company dedicated to regenerative and preventive personalized medicine. Situated in the greater New York City area, ICR boasts state-of-the-art research facilities and a team of dedicated scientists who shape their mission of researching how to repair aging in people quickly.

Conclusion:

This competition is a thrilling opportunity to explore new methods for solving complex bioinformatics problems using diverse data. Your contributions can significantly impact the field and help researchers understand the relationship between health characteristics and potential patient conditions. Let's get started and develop an innovative and robust predictive model to improve the lives of countless individuals through personalized medicine!

In [140]:
# Importing required libraries:

import numpy as np
import pandas as pd

import seaborn as sns
from matplotlib.colors import ListedColormap, LinearSegmentedColormap
import matplotlib.pyplot as plt
from colorama import Style, Fore
import plotly.express as px
import plotly.io as pio

from sklearn.impute import SimpleImputer
from sklearn.preprocessing import LabelEncoder
from sklearn.preprocessing import StandardScaler

from sklearn.model_selection import train_test_split
import warnings
In [2]:
pip install missingno
Requirement already satisfied: missingno in c:\users\apara\anaconda3\lib\site-packages (0.5.2)
Requirement already satisfied: seaborn in c:\users\apara\anaconda3\lib\site-packages (from missingno) (0.12.2)
Requirement already satisfied: matplotlib in c:\users\apara\anaconda3\lib\site-packages (from missingno) (3.7.0)
Requirement already satisfied: scipy in c:\users\apara\anaconda3\lib\site-packages (from missingno) (1.7.3)
Requirement already satisfied: numpy in c:\users\apara\anaconda3\lib\site-packages (from missingno) (1.22.1)
Requirement already satisfied: cycler>=0.10 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (0.11.0)
Requirement already satisfied: contourpy>=1.0.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (1.0.5)
Requirement already satisfied: pillow>=6.2.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (9.4.0)
Requirement already satisfied: pyparsing>=2.3.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (3.0.9)
Requirement already satisfied: fonttools>=4.22.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (4.25.0)
Requirement already satisfied: packaging>=20.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (22.0)
Requirement already satisfied: kiwisolver>=1.0.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (1.4.4)
Requirement already satisfied: python-dateutil>=2.7 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->missingno) (2.8.2)
Requirement already satisfied: pandas>=0.25 in c:\users\apara\anaconda3\lib\site-packages (from seaborn->missingno) (1.5.3)
Requirement already satisfied: pytz>=2020.1 in c:\users\apara\anaconda3\lib\site-packages (from pandas>=0.25->seaborn->missingno) (2022.7)
Requirement already satisfied: six>=1.5 in c:\users\apara\anaconda3\lib\site-packages (from python-dateutil>=2.7->matplotlib->missingno) (1.16.0)
Note: you may need to restart the kernel to use updated packages.
In [3]:
!pip install --upgrade threadpoolctl
!pip install --upgrade scikit-learn imbalanced-learn
Requirement already satisfied: threadpoolctl in c:\users\apara\anaconda3\lib\site-packages (3.2.0)
Requirement already satisfied: scikit-learn in c:\users\apara\anaconda3\lib\site-packages (1.3.0)
Requirement already satisfied: imbalanced-learn in c:\users\apara\anaconda3\lib\site-packages (0.11.0)
Requirement already satisfied: joblib>=1.1.1 in c:\users\apara\anaconda3\lib\site-packages (from scikit-learn) (1.1.1)
Requirement already satisfied: scipy>=1.5.0 in c:\users\apara\anaconda3\lib\site-packages (from scikit-learn) (1.7.3)
Requirement already satisfied: threadpoolctl>=2.0.0 in c:\users\apara\anaconda3\lib\site-packages (from scikit-learn) (3.2.0)
Requirement already satisfied: numpy>=1.17.3 in c:\users\apara\anaconda3\lib\site-packages (from scikit-learn) (1.22.1)
In [4]:
import missingno as msno
import json

pd.set_option('display.max_columns', None): This sets the maximum number of columns to display in pandas DataFrames to None, which means all columns will be displayed.

pd.set_option('display.max_rows', None): This sets the maximum number of rows to display in pandas DataFrames to None, which means all rows will be displayed.

warnings.filterwarnings("ignore"): This line suppresses warnings in your code. It filters out warning messages and prevents them from being displayed in the output. It can be useful if you want to hide warnings that may not be relevant or distracting for your specific use case.

In [5]:
# setting no limit on the number of columns & rows displayed

pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)

#ignoring unnecessary warnings
warnings.filterwarnings("ignore")
In [145]:
#importing all the datsets and assigning respective dataframes

train_df = pd.read_csv('train.csv')
test_df = pd.read_csv('test.csv')
greeks_df = pd.read_csv('greeks.csv')
In [146]:
#checking shape of each dataset

print('Shape of train dataset:', train_df.shape)
print('Shape of test dataset:', test_df.shape)
print('Shape of greeks dataset:', greeks_df.shape)
Shape of train dataset: (617, 58)
Shape of test dataset: (5, 57)
Shape of greeks dataset: (617, 6)

Exploratory Data Analysis: Greeks Dataset:¶

In [8]:
# Examining the greeks dataset

greeks_df.head()
Out[8]:
Id Alpha Beta Gamma Delta Epsilon
0 000ff2bfdfe9 B C G D 3/19/2019
1 007255e47698 A C M B Unknown
2 013f2bd269f5 A C M B Unknown
3 043ac50845d5 A C M B Unknown
4 044fb8a146ec D B F B 3/25/2020
In [9]:
# Statistical summary of string (object) data in train dataset  

greeks_df.describe(include='object').T
Out[9]:
count unique top freq
Id 617 617 000ff2bfdfe9 1
Alpha 617 4 A 509
Beta 617 3 C 407
Gamma 617 8 M 445
Delta 617 4 B 456
Epsilon 617 198 Unknown 144
In [10]:
# Checking datatypes & null values: greeks dataset

greeks_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 617 entries, 0 to 616
Data columns (total 6 columns):
 #   Column   Non-Null Count  Dtype 
---  ------   --------------  ----- 
 0   Id       617 non-null    object
 1   Alpha    617 non-null    object
 2   Beta     617 non-null    object
 3   Gamma    617 non-null    object
 4   Delta    617 non-null    object
 5   Epsilon  617 non-null    object
dtypes: object(6)
memory usage: 29.0+ KB
In [11]:
# Replacing 'unknowns' as null values in greeks dataset

greeks_df = greeks_df.replace("Unknown", pd.NA)
In [12]:
greeks_df.head(8)
Out[12]:
Id Alpha Beta Gamma Delta Epsilon
0 000ff2bfdfe9 B C G D 3/19/2019
1 007255e47698 A C M B <NA>
2 013f2bd269f5 A C M B <NA>
3 043ac50845d5 A C M B <NA>
4 044fb8a146ec D B F B 3/25/2020
5 04517a3c90bd A C M B 10/1/2019
6 049232ca8356 A C M B 5/29/2019
7 057287f2da6d A C M B 4/24/2019
In [13]:
# Calculate missing values for each column (greeks dataset)

print(greeks_df.isnull().sum())
Id           0
Alpha        0
Beta         0
Gamma        0
Delta        0
Epsilon    144
dtype: int64
In [14]:
# Import datatime module to convert the date into number of days

from datetime import datetime
In [15]:
# Converting epsilon column to datatime format

# The errors='coerce' argument is used to handle any values that cannot be converted to datetime format.
# When encountering such values, the 'coerce' option replaces them with *NaT (Not a Time) values*,
# effectively representing missing or invalid dates.

greeks_df['Epsilon'] = pd.to_datetime(greeks_df['Epsilon'], errors='coerce')
In [16]:
greeks_df.head(8)
Out[16]:
Id Alpha Beta Gamma Delta Epsilon
0 000ff2bfdfe9 B C G D 2019-03-19
1 007255e47698 A C M B NaT
2 013f2bd269f5 A C M B NaT
3 043ac50845d5 A C M B NaT
4 044fb8a146ec D B F B 2020-03-25
5 04517a3c90bd A C M B 2019-10-01
6 049232ca8356 A C M B 2019-05-29
7 057287f2da6d A C M B 2019-04-24
In [17]:
# Filling the NaT values with current date

greeks_df['Epsilon'].fillna(pd.to_datetime('today'), inplace=True)
In [18]:
greeks_df.head(8)
Out[18]:
Id Alpha Beta Gamma Delta Epsilon
0 000ff2bfdfe9 B C G D 2019-03-19 00:00:00.000000
1 007255e47698 A C M B 2023-08-04 17:24:18.033964
2 013f2bd269f5 A C M B 2023-08-04 17:24:18.033964
3 043ac50845d5 A C M B 2023-08-04 17:24:18.033964
4 044fb8a146ec D B F B 2020-03-25 00:00:00.000000
5 04517a3c90bd A C M B 2019-10-01 00:00:00.000000
6 049232ca8356 A C M B 2019-05-29 00:00:00.000000
7 057287f2da6d A C M B 2019-04-24 00:00:00.000000
In [19]:
# New column specifying number of days w.r.t most recent date

greeks_df['Days_Epsilon'] = (pd.to_datetime('today') - greeks_df['Epsilon']).dt.days
In [20]:
greeks_df.head(8)
Out[20]:
Id Alpha Beta Gamma Delta Epsilon Days_Epsilon
0 000ff2bfdfe9 B C G D 2019-03-19 00:00:00.000000 1599
1 007255e47698 A C M B 2023-08-04 17:24:18.033964 0
2 013f2bd269f5 A C M B 2023-08-04 17:24:18.033964 0
3 043ac50845d5 A C M B 2023-08-04 17:24:18.033964 0
4 044fb8a146ec D B F B 2020-03-25 00:00:00.000000 1227
5 04517a3c90bd A C M B 2019-10-01 00:00:00.000000 1403
6 049232ca8356 A C M B 2019-05-29 00:00:00.000000 1528
7 057287f2da6d A C M B 2019-04-24 00:00:00.000000 1563
In [21]:
# Replacing the '0' values in Days_Epsilon' column with median value.

median_days = greeks_df.loc[greeks_df["Days_Epsilon"] != 0, "Days_Epsilon"].median()

greeks_df.loc[greeks_df["Days_Epsilon"] == 0, "Days_Epsilon"] = median_days
In [22]:
greeks_df.head(8)
Out[22]:
Id Alpha Beta Gamma Delta Epsilon Days_Epsilon
0 000ff2bfdfe9 B C G D 2019-03-19 00:00:00.000000 1599
1 007255e47698 A C M B 2023-08-04 17:24:18.033964 1467
2 013f2bd269f5 A C M B 2023-08-04 17:24:18.033964 1467
3 043ac50845d5 A C M B 2023-08-04 17:24:18.033964 1467
4 044fb8a146ec D B F B 2020-03-25 00:00:00.000000 1227
5 04517a3c90bd A C M B 2019-10-01 00:00:00.000000 1403
6 049232ca8356 A C M B 2019-05-29 00:00:00.000000 1528
7 057287f2da6d A C M B 2019-04-24 00:00:00.000000 1563
In [23]:
# Greeks dataset with selected columns (dropping epsilon column)

greeks_df = greeks_df[['Id', 'Alpha', 'Beta', 'Gamma', 'Delta', 'Days_Epsilon']]
greeks_df.head(8)
Out[23]:
Id Alpha Beta Gamma Delta Days_Epsilon
0 000ff2bfdfe9 B C G D 1599
1 007255e47698 A C M B 1467
2 013f2bd269f5 A C M B 1467
3 043ac50845d5 A C M B 1467
4 044fb8a146ec D B F B 1227
5 04517a3c90bd A C M B 1403
6 049232ca8356 A C M B 1528
7 057287f2da6d A C M B 1563
In [24]:
# Checking missing values for each column (greeks dataset)

print(greeks_df.isnull().sum())
Id              0
Alpha           0
Beta            0
Gamma           0
Delta           0
Days_Epsilon    0
dtype: int64
In [25]:
# Check for duplicate rows in the DataFrame (greeks dataset)
duplicates_g = greeks_df.duplicated()

# Count the number of duplicate rows
num_duplicates_g = duplicates_g.sum()

# Print the duplicate rows and the count
print("Duplicate Rows:")
print(f"Number of Duplicate Rows: {num_duplicates_g}")
Duplicate Rows:
Number of Duplicate Rows: 0
In [26]:
# Statistical summary of numerical data in train dataset

greeks_df.describe()
Out[26]:
Days_Epsilon
count 617.000000
mean 1444.727715
std 327.708609
min 1039.000000
25% 1192.000000
50% 1467.000000
75% 1563.000000
max 4096.000000
In [27]:
# One-hot encoding for the categorical coulmns for the Greeks dataset 
# (One-hot encoding because it has nominal variables without an inherent order)

one_hot_encoded_greeks_df = pd.get_dummies(greeks_df, columns=['Alpha', 'Beta', 'Gamma', 'Delta'])
one_hot_encoded_greeks_df.head(8)
Out[27]:
Id Days_Epsilon Alpha_A Alpha_B Alpha_D Alpha_G Beta_A Beta_B Beta_C Gamma_A Gamma_B Gamma_E Gamma_F Gamma_G Gamma_H Gamma_M Gamma_N Delta_A Delta_B Delta_C Delta_D
0 000ff2bfdfe9 1599 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1
1 007255e47698 1467 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
2 013f2bd269f5 1467 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
3 043ac50845d5 1467 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
4 044fb8a146ec 1227 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 1 0 0
5 04517a3c90bd 1403 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
6 049232ca8356 1528 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
7 057287f2da6d 1563 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
In [28]:
# Now greeks dataset has all the numerical columns except 'id'

greeks_df = one_hot_encoded_greeks_df
greeks_df.head(8)
Out[28]:
Id Days_Epsilon Alpha_A Alpha_B Alpha_D Alpha_G Beta_A Beta_B Beta_C Gamma_A Gamma_B Gamma_E Gamma_F Gamma_G Gamma_H Gamma_M Gamma_N Delta_A Delta_B Delta_C Delta_D
0 000ff2bfdfe9 1599 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 1
1 007255e47698 1467 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
2 013f2bd269f5 1467 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
3 043ac50845d5 1467 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
4 044fb8a146ec 1227 0 0 1 0 0 1 0 0 0 0 1 0 0 0 0 0 1 0 0
5 04517a3c90bd 1403 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
6 049232ca8356 1528 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0
7 057287f2da6d 1563 1 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 1 0 0

Exploratory Data Analysis: Training dataset¶

In [29]:
# Examining the train dataset

train_df.head()
Out[29]:
Id AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EJ EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL Class
0 000ff2bfdfe9 0.209377 3109.03329 85.200147 22.394407 8.138688 0.699861 0.025578 9.812214 5.555634 4126.58731 22.5984 175.638726 152.707705 823.928241 257.432377 47.223358 0.563481 23.387600 4.851915 0.023482 1.050225 0.069225 13.784111 1.302012 36.205956 69.08340 295.570575 0.23868 0.284232 89.245560 84.31664 29.657104 5.310690 1.74307 23.187704 7.294176 1.987283 1433.166750 0.949104 B 30.879420 78.526968 3.828384 13.394640 10.265073 9028.291921 3.583450 7.298162 1.73855 0.094822 11.339138 72.611063 2003.810319 22.136229 69.834944 0.120343 1
1 007255e47698 0.145282 978.76416 85.200147 36.968889 8.138688 3.632190 0.025578 13.517790 1.229900 5496.92824 19.4205 155.868030 14.754720 51.216883 257.432377 30.284345 0.484710 50.628208 6.085041 0.031442 1.113875 1.117800 28.310953 1.357182 37.476568 70.79836 178.553100 0.23868 0.363489 110.581815 75.74548 37.532000 0.005518 1.74307 17.222328 4.926396 0.858603 1111.287150 0.003042 A 109.125159 95.415086 52.260480 17.175984 0.296850 6785.003474 10.358927 0.173229 0.49706 0.568932 9.292698 72.611063 27981.562750 29.135430 32.131996 21.978000 0
2 013f2bd269f5 0.470030 2635.10654 85.200147 32.360553 8.138688 6.732840 0.025578 12.824570 1.229900 5135.78024 26.4825 128.988531 219.320160 482.141594 257.432377 32.563713 0.495852 85.955376 5.376488 0.036218 1.050225 0.700350 39.364743 1.009611 21.459644 70.81970 321.426625 0.23868 0.210441 120.056438 65.46984 28.053464 1.289739 1.74307 36.861352 7.813674 8.146651 1494.076488 0.377208 B 109.125159 78.526968 5.390628 224.207424 8.745201 8338.906181 11.626917 7.709560 0.97556 1.198821 37.077772 88.609437 13676.957810 28.022851 35.192676 0.196941 0
3 043ac50845d5 0.252107 3819.65177 120.201618 77.112203 8.138688 3.685344 0.025578 11.053708 1.229900 4169.67738 23.6577 237.282264 11.050410 661.518640 257.432377 15.201914 0.717882 88.159360 2.347652 0.029054 1.400300 0.636075 41.116960 0.722727 21.530392 47.27586 196.607985 0.23868 0.292431 139.824570 71.57120 24.354856 2.655345 1.74307 52.003884 7.386060 3.813326 15691.552180 0.614484 B 31.674357 78.526968 31.323372 59.301984 7.884336 10965.766040 14.852022 6.122162 0.49706 0.284466 18.529584 82.416803 2094.262452 39.948656 90.493248 0.155829 0
4 044fb8a146ec 0.380297 3733.04844 85.200147 14.103738 8.138688 3.942255 0.054810 3.396778 102.151980 5728.73412 24.0108 324.546318 149.717165 6074.859475 257.432377 82.213495 0.536467 72.644264 30.537722 0.025472 1.050225 0.693150 31.724726 0.827550 34.415360 74.06532 200.178160 0.23868 0.207708 97.920120 52.83888 26.019912 1.144902 1.74307 9.064856 7.350720 3.490846 1403.656300 0.164268 B 109.125159 91.994825 51.141336 29.102640 4.274640 16198.049590 13.666727 8.153058 48.50134 0.121914 16.408728 146.109943 8524.370502 45.381316 36.262628 0.096614 1
In [30]:
# Statistical summary of numerical data in train dataset

train_df.describe().T
Out[30]:
count mean std min 25% 50% 75% max
AB 617.0 0.477149 0.468388 0.081187 0.252107 0.354659 0.559763 6.161666
AF 617.0 3502.013221 2300.322717 192.593280 2197.345480 3120.318960 4361.637390 28688.187660
AH 617.0 118.624513 127.838950 85.200147 85.200147 85.200147 113.739540 1910.123198
AM 617.0 38.968552 69.728226 3.177522 12.270314 20.533110 39.139886 630.518230
AR 617.0 10.128242 10.518877 8.138688 8.138688 8.138688 8.138688 178.943634
AX 617.0 5.545576 2.551696 0.699861 4.128294 5.031912 6.431634 38.270880
AY 617.0 0.060320 0.416817 0.025578 0.025578 0.025578 0.036845 10.315851
AZ 617.0 10.566447 4.350645 3.396778 8.129580 10.461320 12.969516 38.971568
BC 617.0 8.053012 65.166943 1.229900 1.229900 1.229900 5.081244 1463.693448
BD 617.0 5350.388655 3021.326641 1693.624320 4155.702870 4997.960730 6035.885700 53060.599240
BN 617.0 21.419492 3.478278 9.886800 19.420500 21.186000 23.657700 29.307300
BP 617.0 231.322223 183.992505 72.948951 156.847239 193.908816 247.803462 2447.810550
BQ 557.0 98.328737 96.479371 1.331155 27.834425 61.642115 134.009015 344.644105
BR 617.0 1218.133238 7575.293707 51.216883 424.990642 627.417402 975.649259 179250.252900
BZ 617.0 550.632525 2076.371275 257.432377 257.432377 257.432377 257.432377 50092.459300
CB 615.0 77.104151 159.049302 12.499760 23.317567 42.554330 77.310097 2271.436167
CC 614.0 0.688801 0.263994 0.176874 0.563688 0.658715 0.772206 4.103032
CD 617.0 90.251735 51.585130 23.387600 64.724192 79.819104 99.813520 633.534408
CF 617.0 11.241064 13.571133 0.510888 5.066306 9.123000 13.565901 200.967526
CH 617.0 0.030615 0.014808 0.003184 0.023482 0.027860 0.034427 0.224074
CL 617.0 1.403761 1.922210 1.050225 1.050225 1.050225 1.228445 31.688153
CR 617.0 0.742262 0.281195 0.069225 0.589575 0.730800 0.859350 3.039675
CS 617.0 36.917590 17.266347 13.784111 29.782467 34.835130 40.529401 267.942823
CU 617.0 1.383792 0.538717 0.137925 1.070298 1.351665 1.660617 4.951507
CW 617.0 27.165653 14.645993 7.030640 7.030640 36.019104 37.935832 64.521624
DA 617.0 51.128326 21.210888 6.906400 37.942520 49.180940 61.408760 210.330920
DE 617.0 401.901299 317.745623 35.998895 188.815690 307.509595 507.896200 2103.405190
DF 617.0 0.633884 1.912384 0.238680 0.238680 0.238680 0.238680 37.895013
DH 617.0 0.367002 0.112989 0.040995 0.295164 0.358023 0.426348 1.060404
DI 617.0 146.972099 86.084419 60.232470 102.703553 130.050630 165.836955 1049.168078
DL 617.0 94.795377 28.243187 10.345600 78.232240 96.264960 110.640680 326.236200
DN 617.0 26.370568 8.038825 6.339496 20.888264 25.248800 30.544224 62.808096
DU 616.0 1.802900 9.034721 0.005518 0.005518 0.251741 1.058690 161.355315
DV 617.0 1.924830 1.484555 1.743070 1.743070 1.743070 1.743070 25.192930
DY 617.0 26.388989 18.116679 0.804068 14.715792 21.642456 34.058344 152.355164
EB 617.0 9.072700 6.200281 4.926396 5.965392 8.149404 10.503048 94.958580
EE 617.0 3.064778 2.058344 0.286201 1.648679 2.616119 3.910070 18.324926
EG 617.0 1731.248215 1790.227476 185.594100 1111.160625 1493.817413 1905.701475 30243.758780
EH 617.0 0.305107 1.847499 0.003042 0.003042 0.085176 0.237276 42.569748
EL 557.0 69.582596 38.555707 5.394675 30.927468 71.949306 109.125159 109.125159
EP 617.0 105.060712 68.445620 78.526968 78.526968 78.526968 112.766654 1063.594578
EU 617.0 69.117005 390.187057 3.828384 4.324656 22.641144 49.085352 6501.264480
FC 616.0 71.341526 165.551545 7.534128 25.815384 36.394008 56.714448 3030.655824
FD 617.0 6.930086 64.754262 0.296850 0.296850 1.870155 4.880214 1578.654237
FE 617.0 10306.810737 11331.294051 1563.136688 5164.666260 7345.143424 10647.951650 143224.682300
FI 617.0 10.111079 2.934025 3.583450 8.523098 9.945452 11.516657 35.851039
FL 616.0 5.433199 11.496257 0.173229 0.173229 3.028141 6.238814 137.932739
FR 617.0 3.533905 50.181948 0.497060 0.497060 1.131000 1.512060 1244.227020
FS 615.0 0.421501 1.305365 0.067730 0.067730 0.250601 0.535067 31.365763
GB 617.0 20.724856 9.991907 4.102182 14.036718 18.771436 25.608406 135.781294
GE 617.0 131.714987 144.181524 72.611063 72.611063 72.611063 127.591671 1497.351958
GF 617.0 14679.595398 19352.959387 13.038894 2798.992584 7838.273610 19035.709240 143790.071200
GH 617.0 31.489716 9.864239 9.432735 25.034888 30.608946 36.863947 81.210825
GI 617.0 50.584437 36.266251 0.897628 23.011684 41.007968 67.931664 191.194764
GL 616.0 8.530961 10.327010 0.001129 0.124392 0.337827 21.978000 21.978000
Class 617.0 0.175041 0.380310 0.000000 0.000000 0.000000 0.000000 1.000000
In [31]:
# Statistical summary of string (object) data in train dataset

train_df.describe(include='object').T
Out[31]:
count unique top freq
Id 617 617 000ff2bfdfe9 1
EJ 617 2 B 395
In [32]:
train_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 617 entries, 0 to 616
Data columns (total 58 columns):
 #   Column  Non-Null Count  Dtype  
---  ------  --------------  -----  
 0   Id      617 non-null    object 
 1   AB      617 non-null    float64
 2   AF      617 non-null    float64
 3   AH      617 non-null    float64
 4   AM      617 non-null    float64
 5   AR      617 non-null    float64
 6   AX      617 non-null    float64
 7   AY      617 non-null    float64
 8   AZ      617 non-null    float64
 9   BC      617 non-null    float64
 10  BD      617 non-null    float64
 11  BN      617 non-null    float64
 12  BP      617 non-null    float64
 13  BQ      557 non-null    float64
 14  BR      617 non-null    float64
 15  BZ      617 non-null    float64
 16  CB      615 non-null    float64
 17  CC      614 non-null    float64
 18  CD      617 non-null    float64
 19  CF      617 non-null    float64
 20  CH      617 non-null    float64
 21  CL      617 non-null    float64
 22  CR      617 non-null    float64
 23  CS      617 non-null    float64
 24  CU      617 non-null    float64
 25  CW      617 non-null    float64
 26  DA      617 non-null    float64
 27  DE      617 non-null    float64
 28  DF      617 non-null    float64
 29  DH      617 non-null    float64
 30  DI      617 non-null    float64
 31  DL      617 non-null    float64
 32  DN      617 non-null    float64
 33  DU      616 non-null    float64
 34  DV      617 non-null    float64
 35  DY      617 non-null    float64
 36  EB      617 non-null    float64
 37  EE      617 non-null    float64
 38  EG      617 non-null    float64
 39  EH      617 non-null    float64
 40  EJ      617 non-null    object 
 41  EL      557 non-null    float64
 42  EP      617 non-null    float64
 43  EU      617 non-null    float64
 44  FC      616 non-null    float64
 45  FD      617 non-null    float64
 46  FE      617 non-null    float64
 47  FI      617 non-null    float64
 48  FL      616 non-null    float64
 49  FR      617 non-null    float64
 50  FS      615 non-null    float64
 51  GB      617 non-null    float64
 52  GE      617 non-null    float64
 53  GF      617 non-null    float64
 54  GH      617 non-null    float64
 55  GI      617 non-null    float64
 56  GL      616 non-null    float64
 57  Class   617 non-null    int64  
dtypes: float64(55), int64(1), object(2)
memory usage: 279.7+ KB
In [33]:
# Bar graph representing the null values:

msno.bar(train_df, figsize = (10, 20), fontsize = 8, sort='ascending', color = "#333366",label_rotation=0)
Out[33]:
<Axes: >
In [34]:
# Calculate missing values for each column
missing_values = train_df.isnull().sum()
missing_train = missing_values[missing_values > 0]

# Print columns with missing values
print(missing_train)
BQ    60
CB     2
CC     3
DU     1
EL    60
FC     1
FL     1
FS     2
GL     1
dtype: int64
In [35]:
import pandas as pd
from sklearn.impute import KNNImputer

# Assuming train_df is your DataFrame with missing values

# Separate numerical and categorical columns
numerical_columns = train_df.select_dtypes(include='number').columns
categorical_columns = train_df.select_dtypes(include='object').columns

# Impute missing values in numerical columns using KNN imputer
knn_imputer = KNNImputer(n_neighbors=5)  # You can adjust the value of n_neighbors as per your needs
imputed_numerical_data = knn_imputer.fit_transform(train_df[numerical_columns])
imputed_numerical_df = pd.DataFrame(imputed_numerical_data, columns=numerical_columns)

# Impute missing values in categorical columns with the most frequent value
categorical_imputer = SimpleImputer(strategy='most_frequent')
imputed_categorical_data = categorical_imputer.fit_transform(train_df[categorical_columns])
imputed_categorical_df = pd.DataFrame(imputed_categorical_data, columns=categorical_columns)

# Combine the imputed numerical and categorical DataFrames
imputed_train_df = pd.concat([imputed_numerical_df, imputed_categorical_df], axis=1)
In [36]:
#checking the missing values (deleted later on)
imputed_train_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 617 entries, 0 to 616
Data columns (total 58 columns):
 #   Column  Non-Null Count  Dtype  
---  ------  --------------  -----  
 0   AB      617 non-null    float64
 1   AF      617 non-null    float64
 2   AH      617 non-null    float64
 3   AM      617 non-null    float64
 4   AR      617 non-null    float64
 5   AX      617 non-null    float64
 6   AY      617 non-null    float64
 7   AZ      617 non-null    float64
 8   BC      617 non-null    float64
 9   BD      617 non-null    float64
 10  BN      617 non-null    float64
 11  BP      617 non-null    float64
 12  BQ      617 non-null    float64
 13  BR      617 non-null    float64
 14  BZ      617 non-null    float64
 15  CB      617 non-null    float64
 16  CC      617 non-null    float64
 17  CD      617 non-null    float64
 18  CF      617 non-null    float64
 19  CH      617 non-null    float64
 20  CL      617 non-null    float64
 21  CR      617 non-null    float64
 22  CS      617 non-null    float64
 23  CU      617 non-null    float64
 24  CW      617 non-null    float64
 25  DA      617 non-null    float64
 26  DE      617 non-null    float64
 27  DF      617 non-null    float64
 28  DH      617 non-null    float64
 29  DI      617 non-null    float64
 30  DL      617 non-null    float64
 31  DN      617 non-null    float64
 32  DU      617 non-null    float64
 33  DV      617 non-null    float64
 34  DY      617 non-null    float64
 35  EB      617 non-null    float64
 36  EE      617 non-null    float64
 37  EG      617 non-null    float64
 38  EH      617 non-null    float64
 39  EL      617 non-null    float64
 40  EP      617 non-null    float64
 41  EU      617 non-null    float64
 42  FC      617 non-null    float64
 43  FD      617 non-null    float64
 44  FE      617 non-null    float64
 45  FI      617 non-null    float64
 46  FL      617 non-null    float64
 47  FR      617 non-null    float64
 48  FS      617 non-null    float64
 49  GB      617 non-null    float64
 50  GE      617 non-null    float64
 51  GF      617 non-null    float64
 52  GH      617 non-null    float64
 53  GI      617 non-null    float64
 54  GL      617 non-null    float64
 55  Class   617 non-null    float64
 56  Id      617 non-null    object 
 57  EJ      617 non-null    object 
dtypes: float64(56), object(2)
memory usage: 279.7+ KB
In [37]:
# Calculate missing values for each column
missing_values = imputed_train_df.isnull().sum()
missing_train = missing_values[missing_values > 0]

# Print columns with missing values
print(missing_train)
Series([], dtype: int64)
In [38]:
# Checking the imputed dataframe of train dataset

imputed_train_df.head()
Out[38]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL Class Id EJ
0 0.209377 3109.03329 85.200147 22.394407 8.138688 0.699861 0.025578 9.812214 5.555634 4126.58731 22.5984 175.638726 152.707705 823.928241 257.432377 47.223358 0.563481 23.387600 4.851915 0.023482 1.050225 0.069225 13.784111 1.302012 36.205956 69.08340 295.570575 0.23868 0.284232 89.245560 84.31664 29.657104 5.310690 1.74307 23.187704 7.294176 1.987283 1433.166750 0.949104 30.879420 78.526968 3.828384 13.394640 10.265073 9028.291921 3.583450 7.298162 1.73855 0.094822 11.339138 72.611063 2003.810319 22.136229 69.834944 0.120343 1.0 000ff2bfdfe9 B
1 0.145282 978.76416 85.200147 36.968889 8.138688 3.632190 0.025578 13.517790 1.229900 5496.92824 19.4205 155.868030 14.754720 51.216883 257.432377 30.284345 0.484710 50.628208 6.085041 0.031442 1.113875 1.117800 28.310953 1.357182 37.476568 70.79836 178.553100 0.23868 0.363489 110.581815 75.74548 37.532000 0.005518 1.74307 17.222328 4.926396 0.858603 1111.287150 0.003042 109.125159 95.415086 52.260480 17.175984 0.296850 6785.003474 10.358927 0.173229 0.49706 0.568932 9.292698 72.611063 27981.562750 29.135430 32.131996 21.978000 0.0 007255e47698 A
2 0.470030 2635.10654 85.200147 32.360553 8.138688 6.732840 0.025578 12.824570 1.229900 5135.78024 26.4825 128.988531 219.320160 482.141594 257.432377 32.563713 0.495852 85.955376 5.376488 0.036218 1.050225 0.700350 39.364743 1.009611 21.459644 70.81970 321.426625 0.23868 0.210441 120.056438 65.46984 28.053464 1.289739 1.74307 36.861352 7.813674 8.146651 1494.076488 0.377208 109.125159 78.526968 5.390628 224.207424 8.745201 8338.906181 11.626917 7.709560 0.97556 1.198821 37.077772 88.609437 13676.957810 28.022851 35.192676 0.196941 0.0 013f2bd269f5 B
3 0.252107 3819.65177 120.201618 77.112203 8.138688 3.685344 0.025578 11.053708 1.229900 4169.67738 23.6577 237.282264 11.050410 661.518640 257.432377 15.201914 0.717882 88.159360 2.347652 0.029054 1.400300 0.636075 41.116960 0.722727 21.530392 47.27586 196.607985 0.23868 0.292431 139.824570 71.57120 24.354856 2.655345 1.74307 52.003884 7.386060 3.813326 15691.552180 0.614484 31.674357 78.526968 31.323372 59.301984 7.884336 10965.766040 14.852022 6.122162 0.49706 0.284466 18.529584 82.416803 2094.262452 39.948656 90.493248 0.155829 0.0 043ac50845d5 B
4 0.380297 3733.04844 85.200147 14.103738 8.138688 3.942255 0.054810 3.396778 102.151980 5728.73412 24.0108 324.546318 149.717165 6074.859475 257.432377 82.213495 0.536467 72.644264 30.537722 0.025472 1.050225 0.693150 31.724726 0.827550 34.415360 74.06532 200.178160 0.23868 0.207708 97.920120 52.83888 26.019912 1.144902 1.74307 9.064856 7.350720 3.490846 1403.656300 0.164268 109.125159 91.994825 51.141336 29.102640 4.274640 16198.049590 13.666727 8.153058 48.50134 0.121914 16.408728 146.109943 8524.370502 45.381316 36.262628 0.096614 1.0 044fb8a146ec B
In [39]:
# Assign the new DataFrame with imputed values to train_df

train_df = imputed_train_df
In [40]:
# Check for duplicate rows in the DataFrame
duplicates_tr = train_df.duplicated()

# Count the number of duplicate rows
num_duplicates = duplicates_tr.sum()

# Print the duplicate rows and the count
print("Duplicate Rows:")
print(f"Number of Duplicate Rows: {num_duplicates}")
Duplicate Rows:
Number of Duplicate Rows: 0
In [41]:
# Assigning columns according to datatypes.

numerical_variables = train_df.select_dtypes(exclude='object').columns
categorical_variables = train_df.select_dtypes(include='object').columns

print ('Numerical columns of train dataset:', numerical_variables)
print ('Categorical columns of train dataset:', categorical_variables)
Numerical columns of train dataset: Index(['AB', 'AF', 'AH', 'AM', 'AR', 'AX', 'AY', 'AZ', 'BC', 'BD ', 'BN', 'BP',
       'BQ', 'BR', 'BZ', 'CB', 'CC', 'CD ', 'CF', 'CH', 'CL', 'CR', 'CS', 'CU',
       'CW ', 'DA', 'DE', 'DF', 'DH', 'DI', 'DL', 'DN', 'DU', 'DV', 'DY', 'EB',
       'EE', 'EG', 'EH', 'EL', 'EP', 'EU', 'FC', 'FD ', 'FE', 'FI', 'FL', 'FR',
       'FS', 'GB', 'GE', 'GF', 'GH', 'GI', 'GL', 'Class'],
      dtype='object')
Categorical columns of train dataset: Index(['Id', 'EJ'], dtype='object')

Visualizing numerical variables¶

In [42]:
# Plotting box plots using plotly.express 

def create_boxplot(data,x,y):
    fig = px.box(data, x=x, y=y, color = x, title = f"Box Plots\n{x} vs {y}")
    fig.show()

for feature in numerical_variables:
    create_boxplot(data=train_df,y=feature,x="Class")
In [43]:
# Distribution plot of each feature of train dataset:

# Number of features to plot
num_plots = len(numerical_variables)

# Calculate the number of rows needed based on the number of plots
num_rows = (num_plots + 4) // 3  

# Create a figure and a grid of subplots
fig, ax = plt.subplots(num_rows, 3, figsize=(10, num_rows * 2), dpi=200)

# Flatten the subplot grid into a 1-dimensional array
ax = ax.flatten()

for i, column in enumerate(numerical_variables):
 if i < num_plots:  # Only plot if there are available subplots
    sns.kdeplot(train_df[column], ax=ax[i], color="#333366")
    
    # Set the subplot title
    ax[i].set_title(f'{column} Distribution', size=8)


# Set the main title for the figure    
fig.suptitle('Distribution of Feature\n\n', fontsize=20, fontweight='bold')
plt.tight_layout()
In [44]:
num_plots = len(numerical_variables)  # Number of features to plot
num_rows = (num_plots + 4) // 5  # Calculate the number of rows needed based on the number of plots

# Create a figure and a grid of subplots
fig, ax = plt.subplots(num_rows, 5, figsize=(10, num_rows * 2), dpi=200)
ax = ax.flatten()  # Flatten the subplot grid into a 1-dimensional array

for i, column in enumerate(numerical_variables):
    if i < num_plots:  # Only plot if there are available subplots
        ax[i].hist(train_df[column], color="#333366", bins=20)
        ax[i].set_title(f'{column} Distribution', size=8)  # Set the subplot title

fig.suptitle('Distribution of Features\n\n', fontsize=20, fontweight='bold')  # Set the main title for the figure
plt.tight_layout()  # Adjust the spacing between subplots
In [45]:
train_df.head(8)
Out[45]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL Class Id EJ
0 0.209377 3109.03329 85.200147 22.394407 8.138688 0.699861 0.025578 9.812214 5.555634 4126.58731 22.5984 175.638726 152.707705 823.928241 257.432377 47.223358 0.563481 23.387600 4.851915 0.023482 1.050225 0.069225 13.784111 1.302012 36.205956 69.08340 295.570575 0.238680 0.284232 89.245560 84.31664 29.657104 5.310690 1.74307 23.187704 7.294176 1.987283 1433.166750 0.949104 30.879420 78.526968 3.828384 13.394640 10.265073 9028.291921 3.583450 7.298162 1.73855 0.094822 11.339138 72.611063 2003.810319 22.136229 69.834944 0.120343 1.0 000ff2bfdfe9 B
1 0.145282 978.76416 85.200147 36.968889 8.138688 3.632190 0.025578 13.517790 1.229900 5496.92824 19.4205 155.868030 14.754720 51.216883 257.432377 30.284345 0.484710 50.628208 6.085041 0.031442 1.113875 1.117800 28.310953 1.357182 37.476568 70.79836 178.553100 0.238680 0.363489 110.581815 75.74548 37.532000 0.005518 1.74307 17.222328 4.926396 0.858603 1111.287150 0.003042 109.125159 95.415086 52.260480 17.175984 0.296850 6785.003474 10.358927 0.173229 0.49706 0.568932 9.292698 72.611063 27981.562750 29.135430 32.131996 21.978000 0.0 007255e47698 A
2 0.470030 2635.10654 85.200147 32.360553 8.138688 6.732840 0.025578 12.824570 1.229900 5135.78024 26.4825 128.988531 219.320160 482.141594 257.432377 32.563713 0.495852 85.955376 5.376488 0.036218 1.050225 0.700350 39.364743 1.009611 21.459644 70.81970 321.426625 0.238680 0.210441 120.056438 65.46984 28.053464 1.289739 1.74307 36.861352 7.813674 8.146651 1494.076488 0.377208 109.125159 78.526968 5.390628 224.207424 8.745201 8338.906181 11.626917 7.709560 0.97556 1.198821 37.077772 88.609437 13676.957810 28.022851 35.192676 0.196941 0.0 013f2bd269f5 B
3 0.252107 3819.65177 120.201618 77.112203 8.138688 3.685344 0.025578 11.053708 1.229900 4169.67738 23.6577 237.282264 11.050410 661.518640 257.432377 15.201914 0.717882 88.159360 2.347652 0.029054 1.400300 0.636075 41.116960 0.722727 21.530392 47.27586 196.607985 0.238680 0.292431 139.824570 71.57120 24.354856 2.655345 1.74307 52.003884 7.386060 3.813326 15691.552180 0.614484 31.674357 78.526968 31.323372 59.301984 7.884336 10965.766040 14.852022 6.122162 0.49706 0.284466 18.529584 82.416803 2094.262452 39.948656 90.493248 0.155829 0.0 043ac50845d5 B
4 0.380297 3733.04844 85.200147 14.103738 8.138688 3.942255 0.054810 3.396778 102.151980 5728.73412 24.0108 324.546318 149.717165 6074.859475 257.432377 82.213495 0.536467 72.644264 30.537722 0.025472 1.050225 0.693150 31.724726 0.827550 34.415360 74.06532 200.178160 0.238680 0.207708 97.920120 52.83888 26.019912 1.144902 1.74307 9.064856 7.350720 3.490846 1403.656300 0.164268 109.125159 91.994825 51.141336 29.102640 4.274640 16198.049590 13.666727 8.153058 48.50134 0.121914 16.408728 146.109943 8524.370502 45.381316 36.262628 0.096614 1.0 044fb8a146ec B
5 0.209377 2615.81430 85.200147 8.541526 8.138688 4.013127 0.025578 12.547282 1.229900 5237.54088 10.2399 148.487931 16.526120 642.325163 257.432377 18.382000 0.639460 80.667400 14.688030 0.016716 1.050225 0.857625 32.456996 1.390284 7.030640 55.22404 135.489250 0.238680 0.478275 135.317865 81.46312 31.731600 0.005518 1.74307 16.773128 4.926396 2.394414 866.382950 0.003042 109.125159 78.526968 3.828384 23.304960 0.296850 8517.278846 10.981896 0.173229 0.49706 1.164956 21.915512 72.611063 24177.595550 28.525186 82.527764 21.978000 0.0 04517a3c90bd A
6 0.348249 1733.65412 85.200147 8.377385 15.312480 1.913544 0.025578 6.547778 1.229900 5710.46099 17.6550 143.646993 344.644105 719.725142 257.432377 38.455144 0.946323 78.304856 13.184256 0.033631 1.050225 0.610950 13.784111 2.786085 21.877508 19.21570 107.907985 1.318005 0.460510 176.625563 97.07586 44.506128 1.006962 1.74307 4.474032 4.926396 2.620150 1793.612375 0.097344 13.214487 78.526968 26.304948 48.266400 1.460502 3903.806766 10.777915 4.408484 0.86130 0.467337 17.878444 192.453107 3332.467494 34.166222 100.086808 0.065096 0.0 049232ca8356 B
7 0.269199 966.45483 85.200147 21.174189 8.138688 4.987617 0.025578 9.408886 1.229900 5040.77914 20.8329 170.051724 6.199900 701.018610 257.432377 12.867400 0.771909 71.542272 24.910352 0.029850 1.050225 1.109625 41.932918 1.186155 42.113460 63.21684 326.225295 0.238680 0.325227 83.769368 73.98800 19.905608 2.117379 1.74307 15.870236 8.354376 3.277203 767.720563 0.292032 15.089217 104.985328 5.104512 37.556400 4.518057 18090.349450 10.342388 6.591896 0.49706 0.277693 18.445866 109.693986 21371.759850 35.208102 31.424696 0.092873 0.0 057287f2da6d B
In [141]:
# Converting categorical columns in numerical by Label encoding:

from sklearn.preprocessing import LabelEncoder

# Create a LabelEncoder object
label_encoder = LabelEncoder()

# Fit and transform the 'EJ' column in the train dataset
train_df['EJ'] = label_encoder.fit_transform(train_df['EJ'])

# Print the encoded DataFrame
train_df.head()
Out[141]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL Class EJ
0 0.209377 3109.03329 85.200147 22.394407 8.138688 0.699861 0.025578 9.812214 5.555634 4126.58731 22.5984 175.638726 152.707705 823.928241 257.432377 47.223358 0.563481 23.387600 4.851915 0.023482 1.050225 0.069225 13.784111 1.302012 36.205956 69.08340 295.570575 0.23868 0.284232 89.245560 84.31664 29.657104 5.310690 1.74307 23.187704 7.294176 1.987283 1433.166750 0.949104 30.879420 78.526968 3.828384 13.394640 10.265073 9028.291921 3.583450 7.298162 1.73855 0.094822 11.339138 72.611063 2003.810319 22.136229 69.834944 0.120343 1.0 1
1 0.145282 978.76416 85.200147 36.968889 8.138688 3.632190 0.025578 13.517790 1.229900 5496.92824 19.4205 155.868030 14.754720 51.216883 257.432377 30.284345 0.484710 50.628208 6.085041 0.031442 1.113875 1.117800 28.310953 1.357182 37.476568 70.79836 178.553100 0.23868 0.363489 110.581815 75.74548 37.532000 0.005518 1.74307 17.222328 4.926396 0.858603 1111.287150 0.003042 109.125159 95.415086 52.260480 17.175984 0.296850 6785.003474 10.358927 0.173229 0.49706 0.568932 9.292698 72.611063 27981.562750 29.135430 32.131996 21.978000 0.0 0
2 0.470030 2635.10654 85.200147 32.360553 8.138688 6.732840 0.025578 12.824570 1.229900 5135.78024 26.4825 128.988531 219.320160 482.141594 257.432377 32.563713 0.495852 85.955376 5.376488 0.036218 1.050225 0.700350 39.364743 1.009611 21.459644 70.81970 321.426625 0.23868 0.210441 120.056438 65.46984 28.053464 1.289739 1.74307 36.861352 7.813674 8.146651 1494.076488 0.377208 109.125159 78.526968 5.390628 224.207424 8.745201 8338.906181 11.626917 7.709560 0.97556 1.198821 37.077772 88.609437 13676.957810 28.022851 35.192676 0.196941 0.0 1
3 0.252107 3819.65177 120.201618 77.112203 8.138688 3.685344 0.025578 11.053708 1.229900 4169.67738 23.6577 237.282264 11.050410 661.518640 257.432377 15.201914 0.717882 88.159360 2.347652 0.029054 1.400300 0.636075 41.116960 0.722727 21.530392 47.27586 196.607985 0.23868 0.292431 139.824570 71.57120 24.354856 2.655345 1.74307 52.003884 7.386060 3.813326 15691.552180 0.614484 31.674357 78.526968 31.323372 59.301984 7.884336 10965.766040 14.852022 6.122162 0.49706 0.284466 18.529584 82.416803 2094.262452 39.948656 90.493248 0.155829 0.0 1
4 0.380297 3733.04844 85.200147 14.103738 8.138688 3.942255 0.054810 3.396778 102.151980 5728.73412 24.0108 324.546318 149.717165 6074.859475 257.432377 82.213495 0.536467 72.644264 30.537722 0.025472 1.050225 0.693150 31.724726 0.827550 34.415360 74.06532 200.178160 0.23868 0.207708 97.920120 52.83888 26.019912 1.144902 1.74307 9.064856 7.350720 3.490846 1403.656300 0.164268 109.125159 91.994825 51.141336 29.102640 4.274640 16198.049590 13.666727 8.153058 48.50134 0.121914 16.408728 146.109943 8524.370502 45.381316 36.262628 0.096614 1.0 1
In [47]:
# Checking the distribution of 'Class' variable(output variable)
class_distribution = train_df['Class'].value_counts()

# Plotting the pie chart
plt.figure(figsize=(8, 6))
plt.pie(class_distribution, labels=class_distribution.index, autopct='%1.1f%%')
plt.title('Distribution of Output Class')
plt.show()
In [48]:
# Dropping the'Id' column

train_df = train_df.drop('Id', axis=1)
In [49]:
train_df.head()
Out[49]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL Class EJ
0 0.209377 3109.03329 85.200147 22.394407 8.138688 0.699861 0.025578 9.812214 5.555634 4126.58731 22.5984 175.638726 152.707705 823.928241 257.432377 47.223358 0.563481 23.387600 4.851915 0.023482 1.050225 0.069225 13.784111 1.302012 36.205956 69.08340 295.570575 0.23868 0.284232 89.245560 84.31664 29.657104 5.310690 1.74307 23.187704 7.294176 1.987283 1433.166750 0.949104 30.879420 78.526968 3.828384 13.394640 10.265073 9028.291921 3.583450 7.298162 1.73855 0.094822 11.339138 72.611063 2003.810319 22.136229 69.834944 0.120343 1.0 1
1 0.145282 978.76416 85.200147 36.968889 8.138688 3.632190 0.025578 13.517790 1.229900 5496.92824 19.4205 155.868030 14.754720 51.216883 257.432377 30.284345 0.484710 50.628208 6.085041 0.031442 1.113875 1.117800 28.310953 1.357182 37.476568 70.79836 178.553100 0.23868 0.363489 110.581815 75.74548 37.532000 0.005518 1.74307 17.222328 4.926396 0.858603 1111.287150 0.003042 109.125159 95.415086 52.260480 17.175984 0.296850 6785.003474 10.358927 0.173229 0.49706 0.568932 9.292698 72.611063 27981.562750 29.135430 32.131996 21.978000 0.0 0
2 0.470030 2635.10654 85.200147 32.360553 8.138688 6.732840 0.025578 12.824570 1.229900 5135.78024 26.4825 128.988531 219.320160 482.141594 257.432377 32.563713 0.495852 85.955376 5.376488 0.036218 1.050225 0.700350 39.364743 1.009611 21.459644 70.81970 321.426625 0.23868 0.210441 120.056438 65.46984 28.053464 1.289739 1.74307 36.861352 7.813674 8.146651 1494.076488 0.377208 109.125159 78.526968 5.390628 224.207424 8.745201 8338.906181 11.626917 7.709560 0.97556 1.198821 37.077772 88.609437 13676.957810 28.022851 35.192676 0.196941 0.0 1
3 0.252107 3819.65177 120.201618 77.112203 8.138688 3.685344 0.025578 11.053708 1.229900 4169.67738 23.6577 237.282264 11.050410 661.518640 257.432377 15.201914 0.717882 88.159360 2.347652 0.029054 1.400300 0.636075 41.116960 0.722727 21.530392 47.27586 196.607985 0.23868 0.292431 139.824570 71.57120 24.354856 2.655345 1.74307 52.003884 7.386060 3.813326 15691.552180 0.614484 31.674357 78.526968 31.323372 59.301984 7.884336 10965.766040 14.852022 6.122162 0.49706 0.284466 18.529584 82.416803 2094.262452 39.948656 90.493248 0.155829 0.0 1
4 0.380297 3733.04844 85.200147 14.103738 8.138688 3.942255 0.054810 3.396778 102.151980 5728.73412 24.0108 324.546318 149.717165 6074.859475 257.432377 82.213495 0.536467 72.644264 30.537722 0.025472 1.050225 0.693150 31.724726 0.827550 34.415360 74.06532 200.178160 0.23868 0.207708 97.920120 52.83888 26.019912 1.144902 1.74307 9.064856 7.350720 3.490846 1403.656300 0.164268 109.125159 91.994825 51.141336 29.102640 4.274640 16198.049590 13.666727 8.153058 48.50134 0.121914 16.408728 146.109943 8524.370502 45.381316 36.262628 0.096614 1.0 1
In [50]:
train_df.shape
Out[50]:
(617, 57)

Splitting train data into Training and Validation sets¶

In [51]:
# Split the dataframe into features (x) and target variable (y)

x = train_df.drop('Class', axis=1) # features
y = train_df['Class'] # Target variable
In [52]:
# Stratified Sampling: Stratified sampling is used when the dataset is imbalanced or when you want to ensure that each class
# is represented in both the training and test sets
In [53]:
# Splitting the train data into training & validation sets using stratified sampling

x_train, x_valid, y_train, y_valid = train_test_split(x, y, test_size=0.30, stratify=y, random_state=42)
In [54]:
# Print the shapes of the resulting datasets

print("x_train shape:", x_train.shape)
print("x_valid shape:", x_valid.shape)
print("y_train shape:", y_train.shape)
print("y_valid shape:", y_valid.shape)
x_train shape: (431, 56)
x_valid shape: (186, 56)
y_train shape: (431,)
y_valid shape: (186,)

Scaling the feature set (x) of training dataset using Standard Scaler¶

In [55]:
from sklearn.preprocessing import StandardScaler

# Assuming x_train and x_test are the feature sets

# Create the scaler object
scaler = StandardScaler()

# Fit the scaler on the training set
scaler.fit(x_train)

# Transform the training set
x_train_scaled = scaler.transform(x_train)

# Transform the testing set using the same scaler
x_valid_scaled = scaler.transform(x_valid)
In [56]:
# Converting scaled arrays to respective dataframes

x_train_scaled = pd.DataFrame(x_train_scaled, columns=x_train.columns)
x_valid_scaled = pd.DataFrame(x_valid_scaled, columns=x_valid.columns)
In [57]:
x_train_scaled.head()
Out[57]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL EJ
0 -0.578847 -0.813207 -0.246596 -0.485677 -0.189234 -0.950187 -0.084117 -0.250628 -0.106195 -0.611691 -0.946420 -0.725135 -0.991933 -0.231473 -0.130952 -0.437169 -0.551490 -0.739397 -0.221702 -0.917126 -0.166308 -0.656098 -0.209843 0.185218 -1.327523 -0.165398 -0.254580 -0.166785 0.736573 0.337447 -0.440000 -0.712331 -0.217917 -0.114165 -0.169187 -0.514413 -1.085553 -0.357101 -0.427051 -0.784791 -0.420340 -0.068096 -0.216354 -0.379963 3.917970 -0.002263 -0.480085 -0.065843 -0.116128 -0.501240 -0.254094 0.402761 -0.202064 0.897986 1.301265 -1.334408
1 0.154287 1.373691 -0.246596 -0.197105 -0.203506 -0.215006 -0.084117 -1.158915 -0.106195 -0.264544 0.268795 -0.347287 -1.016493 0.852010 -0.130952 -0.425427 0.131706 -0.641582 -0.404985 0.398658 -0.166308 -0.722531 0.164961 -0.008372 0.748316 0.751784 -0.809338 -0.181093 -1.038729 0.248368 -0.634861 -2.002368 -0.217917 -0.114165 3.881630 -0.369712 0.903865 -0.123023 -0.427051 1.102205 0.009418 -0.281041 0.050020 -0.379963 2.328843 1.330019 -0.480085 -0.052777 -0.173539 0.961042 -0.396663 -0.755450 0.118290 1.294184 1.301265 -1.334408
2 -0.527698 0.103716 -0.246596 -0.426589 -0.203506 0.612901 -0.038771 -1.739376 -0.106195 0.149485 0.876402 -0.211431 1.184260 -0.170499 0.260843 -0.228503 -1.510491 -0.390058 -0.133024 -0.637172 -0.166308 -0.273908 -0.609973 0.443338 -1.327523 0.507370 -0.895136 -0.181093 -0.211876 -0.391073 0.548620 -0.390940 0.011326 -0.114165 -0.060066 0.244356 -0.799612 -0.207139 0.445951 1.102205 0.192743 -0.304459 -0.412353 0.190773 4.684170 -0.672104 -0.006608 -0.045489 -0.054302 -1.012883 0.028176 -0.669144 -1.294994 -0.831437 -0.815500 0.749396
3 -0.433925 0.772965 -0.246596 -0.401710 -0.203506 -1.076029 -0.084117 -1.739376 -0.090624 0.796538 0.775134 0.620209 -0.997453 0.094638 0.034819 -0.218073 -0.037031 -0.960002 -0.677441 -1.253070 -0.166308 -0.856721 1.693993 -1.438786 -0.428232 2.004963 -0.855011 -0.181093 -1.306240 -0.114855 -0.594843 -1.648749 0.220964 -0.114165 -0.194122 0.775130 0.350235 0.185266 0.533880 -1.404094 -0.420340 0.044610 -0.313442 0.448904 2.767541 -0.986670 0.383630 -0.061318 -0.239782 -0.259681 -0.396663 -0.629923 -1.372102 -0.400550 -0.822730 0.749396
4 -0.050308 -0.519493 -0.183946 0.029572 -0.203506 0.616212 -0.084117 -1.046131 -0.106195 -0.942073 -0.440081 -0.635050 -0.750640 0.887792 -0.130952 -0.382460 -0.512117 0.115519 -0.053980 0.174695 -0.166308 0.421128 -1.506004 -0.793486 -1.327523 -1.299858 0.858791 -0.181093 -0.673941 -0.616229 -0.127236 -0.202941 -0.125521 -0.114165 0.563308 -0.698026 0.565198 -0.837864 -0.295159 -0.791123 -0.420340 -0.230367 -0.399638 -0.328584 -0.578600 -1.069938 -0.225156 -0.065843 -0.239782 -0.908594 -0.396663 -0.740521 2.121993 -1.108181 -0.825925 0.749396

Using SMOTE method, oversampling the training set.¶

In [58]:
# Checking the size of splitted data:

print(type(x_train_scaled))
print(x_train_scaled.shape)

print(type(y_train))
print(y_train.shape)
<class 'pandas.core.frame.DataFrame'>
(431, 56)
<class 'pandas.core.series.Series'>
(431,)
In [59]:
x_train_scaled
Out[59]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL EJ
0 -0.578847 -0.813207 -0.246596 -0.485677 -0.189234 -0.950187 -0.084117 -0.250628 -0.106195 -0.611691 -0.946420 -0.725135 -0.991933 -0.231473 -0.130952 -0.437169 -0.551490 -0.739397 -0.221702 -0.917126 -0.166308 -0.656098 -0.209843 0.185218 -1.327523 -0.165398 -0.254580 -0.166785 0.736573 0.337447 -0.440000 -0.712331 -0.217917 -0.114165 -0.169187 -0.514413 -1.085553 -0.357101 -0.427051 -0.784791 -0.420340 -0.068096 -0.216354 -0.379963 3.917970 -0.002263 -0.480085 -0.065843 -0.116128 -0.501240 -0.254094 0.402761 -0.202064 0.897986 1.301265 -1.334408
1 0.154287 1.373691 -0.246596 -0.197105 -0.203506 -0.215006 -0.084117 -1.158915 -0.106195 -0.264544 0.268795 -0.347287 -1.016493 0.852010 -0.130952 -0.425427 0.131706 -0.641582 -0.404985 0.398658 -0.166308 -0.722531 0.164961 -0.008372 0.748316 0.751784 -0.809338 -0.181093 -1.038729 0.248368 -0.634861 -2.002368 -0.217917 -0.114165 3.881630 -0.369712 0.903865 -0.123023 -0.427051 1.102205 0.009418 -0.281041 0.050020 -0.379963 2.328843 1.330019 -0.480085 -0.052777 -0.173539 0.961042 -0.396663 -0.755450 0.118290 1.294184 1.301265 -1.334408
2 -0.527698 0.103716 -0.246596 -0.426589 -0.203506 0.612901 -0.038771 -1.739376 -0.106195 0.149485 0.876402 -0.211431 1.184260 -0.170499 0.260843 -0.228503 -1.510491 -0.390058 -0.133024 -0.637172 -0.166308 -0.273908 -0.609973 0.443338 -1.327523 0.507370 -0.895136 -0.181093 -0.211876 -0.391073 0.548620 -0.390940 0.011326 -0.114165 -0.060066 0.244356 -0.799612 -0.207139 0.445951 1.102205 0.192743 -0.304459 -0.412353 0.190773 4.684170 -0.672104 -0.006608 -0.045489 -0.054302 -1.012883 0.028176 -0.669144 -1.294994 -0.831437 -0.815500 0.749396
3 -0.433925 0.772965 -0.246596 -0.401710 -0.203506 -1.076029 -0.084117 -1.739376 -0.090624 0.796538 0.775134 0.620209 -0.997453 0.094638 0.034819 -0.218073 -0.037031 -0.960002 -0.677441 -1.253070 -0.166308 -0.856721 1.693993 -1.438786 -0.428232 2.004963 -0.855011 -0.181093 -1.306240 -0.114855 -0.594843 -1.648749 0.220964 -0.114165 -0.194122 0.775130 0.350235 0.185266 0.533880 -1.404094 -0.420340 0.044610 -0.313442 0.448904 2.767541 -0.986670 0.383630 -0.061318 -0.239782 -0.259681 -0.396663 -0.629923 -1.372102 -0.400550 -0.822730 0.749396
4 -0.050308 -0.519493 -0.183946 0.029572 -0.203506 0.616212 -0.084117 -1.046131 -0.106195 -0.942073 -0.440081 -0.635050 -0.750640 0.887792 -0.130952 -0.382460 -0.512117 0.115519 -0.053980 0.174695 -0.166308 0.421128 -1.506004 -0.793486 -1.327523 -1.299858 0.858791 -0.181093 -0.673941 -0.616229 -0.127236 -0.202941 -0.125521 -0.114165 0.563308 -0.698026 0.565198 -0.837864 -0.295159 -0.791123 -0.420340 -0.230367 -0.399638 -0.328584 -0.578600 -1.069938 -0.225156 -0.065843 -0.239782 -0.908594 -0.396663 -0.740521 2.121993 -1.108181 -0.825925 0.749396
5 -0.493599 -0.372892 -0.148156 0.086845 -0.203506 1.656063 -0.084117 1.845649 -0.106195 0.752954 -0.136277 0.179512 -0.593107 0.061663 0.260671 -0.260777 0.131682 0.316055 -0.637178 -0.581181 0.105434 -1.425242 0.694937 1.841486 -1.327523 0.821368 0.890185 -0.181093 0.006997 0.614259 -0.503212 -0.259341 -0.217917 -0.114165 -0.810830 -0.030454 -0.300736 0.027032 -0.427051 1.102205 -0.420340 -0.079224 0.597405 -0.379963 -0.260700 -0.248364 -0.480085 -0.057171 -0.217701 1.699537 -0.396663 1.727410 -1.140032 0.197489 1.301265 -1.334408
6 0.103138 0.262438 -0.246596 -0.201165 -0.203506 -0.099099 -0.084117 0.994506 -0.050880 -0.173295 0.775134 -0.261449 0.859735 0.309910 -0.130952 -0.015509 -0.303474 -1.250474 4.781053 -0.133254 -0.166308 0.033644 -0.088515 -0.535366 -1.327523 -0.062283 -0.361939 -0.181093 -0.138918 -0.257560 0.213806 -0.414217 -0.058553 -0.114165 -1.400381 0.081415 -1.026742 -0.041610 0.282656 -0.021718 -0.420340 -0.304459 -0.393724 0.152394 -0.153704 -2.200526 -0.026547 -0.024844 0.069352 -0.134890 -0.396663 -0.306927 1.320363 1.629745 -0.812611 0.749396
7 0.213961 0.080134 0.081135 -0.274809 -0.203506 -0.218318 -0.084117 1.597525 -0.106195 0.534563 0.167527 0.025421 0.224064 0.008408 -0.130952 0.267216 0.327473 -0.066286 -0.065095 -0.385213 -0.017041 0.237179 0.153105 -0.503101 0.720467 0.971292 0.302690 -0.181093 -0.892814 -0.144993 0.080797 -0.690845 0.181171 -0.114165 0.670454 -0.067542 -0.073606 0.399997 0.157044 -1.166922 -0.420340 -0.190574 1.199544 0.229771 -0.530029 -0.551829 0.528207 -0.036521 0.268080 0.367840 -0.233196 -0.529183 0.795876 -0.600082 -0.825954 0.749396
8 -0.510648 -0.843439 -0.246596 -0.396268 -0.203506 -0.860773 -0.070636 -0.158897 -0.106195 -0.651438 -1.655296 -0.427256 -0.726725 0.098616 0.203314 0.916565 -0.634510 -0.192346 -0.854302 -0.189245 -0.166308 -0.134160 -0.409448 -0.395552 1.295435 0.396793 -0.476378 -0.181093 0.031316 -0.376290 0.469086 -0.284407 -0.217917 -0.114165 -0.392367 -0.143540 -0.998351 -0.238733 -0.427051 -0.346208 -0.130500 -0.304459 -0.218621 -0.379963 -0.531535 -0.596238 -0.480085 -0.044898 -0.239782 0.555026 -0.396663 1.821916 0.419274 -0.822834 1.301265 -1.334408
9 -0.519173 -0.461430 -0.246596 -0.289322 -0.203506 -0.751489 0.085010 -0.701764 -0.076410 -0.434568 -1.250224 -0.629755 -1.052292 -0.203587 -0.130952 -0.437169 -0.877615 -0.219996 -0.673612 0.538635 -0.166308 0.961328 0.544140 -0.148187 0.703984 0.141040 -0.973466 -0.181093 -1.014410 -0.019464 -0.226312 -1.094597 -0.217917 -0.114165 1.702909 -0.105845 -0.740801 -0.467864 -0.427051 1.102205 0.977453 -0.244347 -0.232716 -0.379963 0.094599 0.280847 -0.480085 -0.065843 -0.239782 -0.906811 -0.396663 -0.171890 0.418156 -0.689253 1.301265 -1.334408
10 0.682826 0.010062 0.515484 -0.318305 -0.203506 0.215505 -0.084117 0.568935 -0.106195 0.020276 0.876402 -0.018728 0.398212 -0.217860 -0.130952 -0.228110 -0.621657 -0.247200 -0.698560 -0.413208 0.059506 0.916069 -0.196836 -0.352532 -1.327523 0.982049 -0.384091 -0.181093 0.858169 -0.217383 0.585129 -0.215474 -0.208988 -0.114165 0.691686 0.160454 0.222474 -0.428224 0.043993 -1.108934 -0.420340 -0.186841 -0.281309 0.033542 -0.112457 1.972104 -0.359805 -0.065843 -0.239782 0.119149 -0.396663 -0.522176 1.489108 -0.382226 -0.613386 0.749396
11 -0.549010 -0.066405 -0.217147 -0.368171 -0.203506 -0.607434 0.072141 0.024565 -0.019194 -0.146026 0.268795 -0.160619 -0.993257 -0.151082 -0.130952 -0.068085 -0.159779 -0.489211 -0.120056 -0.287229 -0.152912 -0.299581 1.025653 -0.062147 0.628552 -0.591426 0.324727 -0.181093 0.371785 0.286088 0.209260 -0.959864 -0.217917 -0.114165 0.759701 0.068648 -0.783388 -0.406643 -0.427051 0.466913 3.467575 -0.024570 -0.342864 -0.379963 3.105926 -0.196554 -0.480085 -0.039430 0.003109 -0.100572 -0.396663 -0.501335 1.838331 0.023413 1.301265 -1.334408
12 0.572003 -0.033079 -0.246596 0.557994 -0.203506 -0.499806 -0.084117 1.011048 -0.106195 0.507061 -0.440081 -0.192059 -0.775198 -0.052822 -0.130952 -0.281637 0.737280 1.270124 -0.687568 0.426654 -0.108898 0.168893 -0.140086 0.034648 -1.327523 -1.070563 -0.648943 -0.181093 -0.819856 0.705244 0.037630 2.455915 -0.217917 -0.114165 -1.052279 0.790330 1.579172 -0.361040 -0.427051 1.102205 -0.420340 1.685083 -0.236462 -0.379963 -0.345110 -0.742419 -0.480085 -0.050028 -0.239782 -1.553050 0.591823 -0.349253 2.209904 2.546141 1.301265 -1.334408
13 -0.485074 -0.245120 0.457185 -0.034095 -0.203506 0.490370 -0.084117 2.926871 -0.015664 0.195115 -0.541349 -0.072563 -0.730612 -0.159863 -0.130952 8.195646 -0.432636 1.316356 0.002833 0.006723 0.013578 0.686861 0.332794 -1.298971 0.676825 0.249389 -0.196615 -0.181093 -2.011497 -0.986462 -1.521540 -0.687264 -0.217917 -0.114165 -0.347435 -0.197043 -0.152696 0.181600 -0.427051 -0.726004 0.307879 -0.073695 2.383467 -0.379963 0.122313 0.106910 -0.480085 -0.054383 0.091433 -0.621574 -0.396663 1.080484 -0.094038 -1.084961 1.301265 -1.334408
14 -0.459499 -0.574834 -0.246596 -0.348496 -0.203506 0.298296 -0.084117 0.964431 -0.106195 -0.047181 -0.541349 0.127067 -0.837289 -0.151300 -0.130952 -0.435726 0.093481 -0.049414 -0.569743 0.034718 -0.166308 0.089226 -0.555410 -0.072902 0.640763 2.884833 1.134054 -0.181093 -1.136006 -0.225664 -0.755127 -0.527017 -0.217917 -0.114165 -0.727137 -0.560620 -0.428497 0.052463 -0.427051 0.034310 -0.420340 -0.067251 -0.358831 -0.379963 -0.241417 0.399272 -0.480085 -0.055313 -0.239782 0.154804 -0.396663 1.574350 0.654324 -0.337571 1.301265 -1.334408
15 -0.127032 0.797323 -0.246596 -0.082903 -0.203506 0.179078 -0.084117 0.681719 -0.070797 0.261206 0.673866 0.024274 1.644121 -0.203182 -0.130952 -0.010064 -0.303474 0.916176 -0.131789 0.482645 -0.032350 -0.267291 -0.053752 -1.922760 0.641596 2.995702 0.327904 -0.181093 -1.111686 0.520435 -1.647176 0.664545 -0.217917 -0.114165 0.016467 -0.023159 0.064294 0.329671 -0.427051 -0.378591 -0.420340 -0.069575 -0.217192 -0.379963 -0.487441 0.056950 -0.480085 -0.047182 0.219502 1.413410 -0.396663 1.432323 0.759371 1.242430 1.301265 -1.334408
16 -0.340152 0.298213 -0.246596 -0.344004 -0.203506 0.275115 -0.084117 0.666681 -0.106195 0.642492 0.268795 -0.264582 1.118972 -0.219862 -0.130952 0.003580 -0.481856 -0.477616 -0.063119 0.006723 0.028887 -0.027496 0.071490 0.368053 -0.056406 2.046151 -0.626603 -0.181093 1.004084 -0.659455 0.946358 0.535630 -0.104169 -0.114165 -0.645914 -0.683434 -0.440665 -0.009861 -0.031374 1.102205 0.558777 -0.195363 -0.319454 -0.093357 -0.470009 0.465886 -0.097475 -0.065843 -0.116128 -0.309597 -0.396663 -0.602621 0.799229 -0.311239 -0.816894 0.749396
17 -0.144081 1.051117 -0.190398 -0.284484 -0.203506 1.960733 -0.001392 -0.115287 -0.084422 -0.213238 -0.237545 -0.426351 0.029997 -0.154101 -0.130952 -0.400303 0.162275 -0.627683 0.414357 0.482645 -0.166308 -0.920772 -0.054212 -0.331022 0.984436 1.337525 0.109531 0.066310 -0.382110 0.018383 1.045793 -0.287988 -0.077964 -0.114165 -0.020812 0.024873 0.697015 -0.501740 0.094238 0.490752 0.349362 -0.150816 -0.260709 -0.144117 -0.642311 -0.727616 -0.213453 -0.065843 -0.120545 -0.719624 0.833334 -0.541809 1.126660 -0.588508 -0.815829 0.749396
18 -0.621471 -0.376772 -0.246596 -0.455442 -0.203506 -1.821145 -0.084117 -0.602514 0.110900 -0.666144 -1.756564 0.307679 -0.907070 -0.174392 -0.130952 0.135628 -0.497584 -1.250474 -0.865418 -0.217240 -0.166308 0.149307 -0.507753 0.421828 0.729800 -0.155706 -0.582381 -0.181093 0.177231 -0.497731 0.414250 1.493535 -0.217917 -0.114165 -1.400381 -0.698026 -0.633320 -0.299953 -0.427051 -0.435262 -0.420340 -0.304459 -0.395892 -0.379963 -0.175158 -2.200526 -0.480085 -0.055293 -0.230949 -0.909485 -0.396663 0.531907 2.958639 -0.224726 1.301265 -1.334408
19 -0.229329 0.178803 -0.246596 -0.091153 -0.203506 -0.821034 -0.041222 -0.464166 -0.084377 -0.176442 0.673866 -0.645265 0.116922 -0.128732 -0.130952 1.023817 0.000639 0.583040 -0.127837 -0.301227 -0.166308 0.568551 -0.144230 0.389563 -1.327523 1.098150 -0.397242 -0.181093 -0.673941 -0.351511 0.764598 2.191820 -0.019732 -0.114165 -0.343732 -0.698026 -0.884786 -0.287283 0.408268 1.102205 0.035124 -0.304459 -0.267412 0.299101 0.327248 -0.422301 0.065272 -0.052748 -0.111712 -0.571658 -0.396663 -0.297002 -0.630073 -0.608755 -0.813685 0.749396
20 0.111663 0.662252 -0.199177 -0.434191 -0.203506 0.535077 -0.084117 -0.127318 0.069074 0.015557 0.977670 0.762540 2.648425 -0.102684 0.211834 -0.390070 -0.955723 0.690295 -0.165259 1.182530 -0.032350 -0.273378 -0.219628 0.416450 0.293453 -0.873927 -0.608509 -0.181093 -0.163238 -0.079676 0.429176 0.659621 -0.217917 -0.114165 4.246767 0.318531 -1.389745 -0.420465 -0.427051 1.102205 0.559239 0.015998 -0.276529 -0.379963 -0.066699 -0.391770 -0.480085 0.687176 -0.239782 1.408953 -0.396663 -0.733277 1.461170 0.284142 1.301265 -1.334408
21 -0.681144 -0.547013 -0.246596 -0.113008 -0.203506 0.169143 -0.084117 -1.739376 -0.106195 -0.279279 0.167527 0.183748 -0.559871 -0.321829 -0.130952 -0.385478 -0.457346 -0.255822 -0.787732 0.874580 -0.166308 -0.286083 0.011632 -0.599896 1.272691 1.131101 0.508865 -0.181093 -0.844175 -0.986462 -0.692202 -0.738293 -0.217723 -0.114165 -0.389158 -0.598316 -1.097720 -0.250137 -0.282598 -1.144844 0.365909 0.009201 -0.305064 -0.280301 -0.380485 -0.479663 -0.456503 -0.058637 -0.058718 0.620096 -0.396663 -0.559430 -1.007047 -1.067827 -0.140490 0.749396
22 0.120188 -0.939664 -0.246596 -0.104845 -0.203506 -0.542857 0.127904 0.424572 -0.069530 -0.352582 -0.440081 -0.417813 -1.019929 -0.115651 0.309497 -0.022331 0.144712 0.050111 -0.064848 0.622622 -0.020868 -0.336636 0.266260 0.228238 0.894392 0.603896 0.780152 -0.181093 0.614977 0.310465 -0.434846 -0.173398 -0.217917 -0.114165 0.884993 -0.226227 1.240505 -0.570470 -0.427051 1.102205 0.778428 0.039416 1.236161 -0.379963 -0.400215 -0.063325 -0.480085 -0.065843 -0.239782 -0.851546 -0.396663 0.736072 -0.504166 0.085378 1.301265 -1.334408
23 0.120188 0.305847 -0.246596 0.120406 -0.203506 -0.377276 -0.084117 0.346375 -0.106195 0.015730 0.977670 -0.150116 2.074014 -0.102240 -0.130952 0.769692 -0.877615 0.406140 0.388050 -0.217240 -0.166308 -0.204563 -0.050183 -0.449327 0.632935 0.267415 -0.082781 -0.181093 -0.844175 -0.292294 -1.036680 1.900868 -0.007114 -0.114165 -0.833049 0.005417 -0.931429 -0.017827 0.596686 -1.488787 -0.420340 -0.304459 -0.333796 0.370908 0.192873 -0.614742 0.024686 -0.054199 -0.085215 0.276029 -0.396663 0.448426 -0.976875 -0.539796 -0.810841 0.749396
24 -0.058833 0.071624 -0.189723 0.318057 -0.203506 1.162631 -0.084117 -0.394991 -0.106195 0.249973 1.180206 -0.052441 -0.718585 -0.122698 -0.130952 0.121328 -1.216438 -1.250474 1.414760 0.118704 -0.166308 0.067787 -0.297444 0.088423 0.679443 -0.289058 1.335917 -0.181093 -0.163238 -0.571498 0.245304 0.785402 -0.217917 -0.114165 -1.400381 0.035816 -0.661711 -0.210658 -0.427051 1.102205 0.020501 -0.304459 1.058250 -0.379963 -0.494185 -2.200526 -0.480085 -0.049409 -0.058718 -0.850655 -0.396663 1.643507 0.731060 -0.963689 1.301265 -1.334408
25 -0.476549 -1.126763 0.467014 0.428243 -0.190376 0.553291 0.255362 -0.086715 -0.057218 -0.617654 -0.136277 0.006623 2.140964 -0.321829 -0.130952 -0.233095 0.218595 -0.363151 0.389903 -0.161250 0.162845 0.367134 0.059058 1.475817 1.008125 0.736472 0.751203 -0.181093 2.268682 -0.055299 -0.059013 0.727211 -0.217917 -0.114165 -0.400267 -0.302833 -0.793528 -0.099182 -0.427051 1.102205 2.160829 -0.297592 -0.372778 -0.379963 -0.174554 -0.340884 -0.480085 -0.065843 -0.239782 -0.242745 -0.113692 1.209302 -1.173557 0.377928 1.301265 -1.334408
26 -0.433925 0.077585 -0.246596 -0.352384 -0.203506 -0.400457 -0.084117 0.226072 -0.106195 -0.492032 -0.035009 -0.222992 -0.239290 -0.048429 -0.130952 -0.349268 -0.061895 -0.761696 -0.416471 -0.385213 -0.166308 -1.156597 -0.800828 0.561643 -0.348988 -1.166312 -0.035535 -0.181093 -0.357791 0.185550 -0.483955 -0.032845 -0.028466 -0.114165 0.093247 -0.278514 0.208279 15.311691 0.081677 -0.731937 -0.420340 -0.304459 -0.363563 0.142489 -0.207420 -1.064387 0.203223 -0.054461 -0.239782 -0.509262 1.075227 -0.601701 -1.891746 0.922604 -0.820469 0.749396
27 -0.263429 -0.086536 0.552248 -0.273125 -0.203506 -0.569350 0.009025 -1.739376 -0.106195 -0.113400 1.686546 0.435559 -0.789562 -0.120099 -0.130952 -0.158511 0.140156 -0.603675 -0.718074 -0.077263 -0.166308 -0.287141 -0.078155 -0.578386 -1.327523 -0.646763 -0.505435 -0.181093 -0.187557 -0.290345 -0.177060 0.491764 -0.104169 -0.114165 0.432708 0.739259 -0.903037 0.071915 0.295217 -1.431666 0.884020 -0.175924 1.321125 0.358527 -0.240852 0.687933 0.037038 -0.055375 -0.116128 0.831349 -0.396663 -0.554581 -0.097018 0.323237 -0.804187 0.749396
28 -0.468024 0.621288 -0.246596 -0.418987 -0.203506 -1.039601 -0.084117 0.224568 -0.106195 0.047660 -0.440081 -0.156228 0.028144 -0.140105 -0.130952 -0.296134 0.791032 -0.377422 -0.468221 -1.001112 -0.166308 -0.062962 -0.916631 0.120688 -1.327523 1.969008 -0.285732 -0.181093 0.444742 -0.236677 0.630229 1.260773 -0.081846 -0.114165 -0.858972 0.058312 0.222474 -0.185195 0.119360 -1.112345 -0.420340 -0.304459 -0.349468 0.295387 0.052197 -0.181750 0.209312 -0.065843 0.250415 -1.333774 0.738218 -0.319562 -1.477894 -0.248575 -0.814537 0.749396
29 -0.297528 -1.040557 -0.246596 -0.457084 -0.203506 -1.009796 -0.084117 1.244135 -0.038115 -0.111388 -0.338813 -0.466551 -1.008579 -0.219330 -0.130952 -0.211448 0.224856 -0.299824 -0.465751 -0.189245 0.036542 0.241414 -0.165180 0.099178 0.949661 -0.906780 -0.793500 -0.181093 -0.041642 -0.124682 -0.716828 -0.907493 -0.212870 -0.114165 -0.823174 -0.698026 3.014960 3.923277 -0.194669 0.375159 -0.368468 -0.088468 -0.253070 0.157965 -0.201126 0.158721 -0.225129 -0.065843 0.215086 0.304553 -0.021573 -0.346999 0.120525 0.428563 -0.649911 0.749396
30 -0.382776 -1.594110 -0.246596 -0.465895 -0.203506 -0.208383 -0.030192 -1.739376 -0.106195 -0.078861 -0.237545 0.010329 0.811600 -0.321829 0.250764 -0.191638 0.064723 -0.378462 -0.796007 -1.001112 -0.120380 -0.775466 -0.346827 -2.094840 0.740879 -0.103373 1.863983 1.611789 -1.938539 -0.986462 -1.356818 -0.779474 0.196700 -0.114165 -0.702449 -0.698026 -0.531923 -0.162419 1.501091 -1.540588 -0.082937 0.372409 -0.440592 0.612325 -0.253483 -0.379742 0.012412 -0.065843 -0.160290 -0.232940 -0.396663 -0.344805 -0.859163 -0.992084 -0.811783 0.749396
31 -0.672620 1.230345 -0.246596 -0.511766 -0.203506 -0.314355 0.021281 0.244117 -0.066905 -0.617603 0.268795 -0.061223 -1.053700 -0.220508 -0.130952 -0.347497 -0.305379 -0.512996 -0.559615 -0.609176 -0.166308 0.883779 -0.071939 0.077668 -1.327523 -1.012221 -0.656775 -0.181093 -0.430749 0.304238 0.881214 -0.221741 -0.217917 -0.114165 0.271248 -0.028022 -0.686047 -0.791418 -0.427051 1.102205 -0.420340 -0.304459 -0.462868 -0.379963 11.251696 -0.191002 -0.480085 -0.041656 -0.239782 0.406168 0.566064 -0.514352 0.681517 0.316103 1.301265 -1.334408
32 -0.127032 0.154516 -0.246596 -0.236367 -0.203506 -0.556104 -0.084117 -0.237094 0.531013 -0.346859 0.876402 -0.464698 -0.680984 -0.038133 0.255883 -0.365143 -1.940026 -0.675773 -0.017051 -0.357217 -0.166308 -1.334988 -0.297789 -0.632161 -0.244464 -0.789806 -0.806933 -0.181093 -1.208963 -0.672756 0.309160 -0.792007 0.074411 -0.114165 -0.870822 -0.462126 -1.156531 -0.397564 0.182166 -0.548926 0.113472 -0.099701 -0.325368 -0.148450 -0.394438 -0.777576 -0.151984 20.688705 -0.239782 -0.453998 -0.396663 -0.764028 -0.547376 -0.775766 -0.823176 0.749396
33 -0.485074 -0.331026 0.166369 -0.376572 -0.203506 -0.599155 -0.084117 -0.990491 -0.106195 -0.332065 0.775134 -0.445083 -0.201041 -0.052154 -0.130952 -0.217155 -0.572970 -0.539902 -0.255542 -0.805144 -0.166308 -0.025378 0.666849 0.002383 0.898807 -0.153187 -0.785843 -0.181093 -0.333472 -0.345623 0.584628 -0.484045 -0.217723 -0.114165 -0.317069 -0.329585 0.660512 -0.090704 -0.182108 1.102205 0.032507 -0.304459 -0.193536 -0.203543 -0.285965 -0.292774 -0.458414 -0.065843 -0.116128 -0.442410 0.551622 -0.644814 0.479992 -0.449716 0.320871 0.749396
34 0.051989 -1.176496 -0.246596 -0.393849 0.319982 -1.456866 -0.084117 -1.739376 -0.106195 0.371299 -0.946420 0.145557 1.435385 -0.101287 -0.130952 -0.085370 1.598988 0.002690 -0.075717 0.230686 -0.024696 -0.528260 -1.506004 3.648325 -1.327523 -1.405008 -0.974447 0.271786 -0.479387 0.032107 -0.514594 2.646601 0.007444 -0.114165 -1.158685 -0.698026 0.297509 -0.292637 -0.106741 -1.537362 -0.420340 -0.294616 -0.150365 -0.065501 -0.332972 0.362264 0.254426 -0.056372 -0.151458 0.289400 -0.061456 -0.691318 -0.006127 1.524139 -0.826098 0.749396
35 -0.689669 0.028367 -0.246596 -0.371734 -0.203506 -0.741555 -0.048576 -0.772442 -0.073151 -0.295360 1.787814 -0.680258 -0.443547 -0.321829 -0.130952 -0.187702 -0.487761 0.182562 0.301966 -0.861135 -0.166308 0.278203 -0.322193 0.341165 -1.327523 -0.753270 -0.273709 -0.181093 0.189391 -0.148784 1.037739 -0.263369 -0.149785 -0.114165 -0.825149 0.216389 0.419186 -0.468473 0.270095 1.102205 0.310034 0.121502 -0.403235 0.439000 -0.379385 -0.191002 -0.023066 -0.065843 -0.239782 -0.634053 0.734616 -0.618465 -1.268732 0.482485 -0.787047 0.749396
36 -0.357202 0.195736 -0.246596 -0.356444 -0.203506 -0.102411 -0.037546 0.596003 -0.106195 0.345397 0.572598 2.062345 -0.426028 -0.173059 -0.130952 -0.380755 0.291970 0.481657 -0.307168 0.118704 -0.166308 0.196948 -0.856312 -0.223472 -1.327523 -0.847663 -1.076420 -0.181093 -0.114599 -0.087449 0.044789 0.081745 -0.217917 -0.114165 -0.571356 -0.047478 -1.266040 0.061193 -0.427051 1.102205 -0.157744 1.036775 -0.089303 -0.379963 -0.242332 0.086556 -0.480085 -0.053212 -0.213284 0.595137 1.882250 -0.284766 -0.248255 1.158364 1.301265 -1.334408
37 -0.408350 -0.553438 -0.246596 -0.371734 -0.203506 -0.072606 0.009025 0.695253 -0.106195 -0.074020 -0.136277 -0.280379 -0.795111 -0.066183 -0.130952 -0.376819 0.283165 -0.621811 -0.531950 0.538635 -0.166308 1.327373 0.074023 1.239207 -1.327523 -0.049102 -0.620585 0.172625 -0.114599 -0.986462 -0.684184 1.260773 -0.217917 -0.114165 -0.430881 -0.446318 -0.621152 0.017250 -0.427051 -1.004938 -0.063696 -0.274773 1.025082 -0.379963 -0.320342 0.493642 -0.480085 -0.047395 -0.239782 0.683382 -0.396663 0.849511 0.082157 -0.623232 1.301265 -1.334408
38 -0.425400 -0.678887 0.147087 -0.180605 -0.203506 0.205571 -0.049801 2.799049 -0.015483 0.017960 -0.541349 0.099709 -0.820066 -0.192479 -0.130952 2.015256 -0.286243 1.091292 0.154993 -0.217240 0.055679 0.577815 0.329801 0.142198 0.628415 0.227874 -0.151358 -0.181093 -0.455068 -0.986462 -0.703871 -0.706959 -0.217917 -0.114165 0.344078 -0.396463 -0.304792 -0.367757 -0.427051 -0.625061 -0.420340 -0.304459 1.102456 -0.379963 -0.124513 0.878523 -0.480085 -0.055564 0.140011 -0.397842 -0.396663 1.129521 -0.379004 0.747270 1.301265 -1.334408
39 -0.195230 -0.284100 -0.246596 -0.402617 -0.100179 0.457254 0.081946 -0.798006 -0.018334 -0.224023 -0.338813 0.073718 1.397531 0.072930 0.272869 0.258820 0.078593 -0.230105 1.558398 -0.497194 -0.166308 -0.156392 0.259468 0.765987 1.010251 3.049004 0.427835 -0.181093 -0.552345 0.282043 0.767032 -0.323798 -0.117757 -0.114165 -0.518523 0.727707 -0.568426 -0.286369 0.094238 1.102205 1.040177 -0.042371 -0.437339 0.270007 -0.021208 0.528800 0.115675 -0.058652 -0.045469 -1.177785 -0.036015 -0.397422 0.531025 0.061110 -0.809243 0.749396
40 7.579400 1.905037 -0.246596 8.237573 5.375898 0.510240 -0.084117 1.937380 -0.062152 -0.055835 2.294153 -0.282321 0.597921 -0.157376 -0.130952 -0.188227 -1.355932 0.024245 -0.522564 6.417670 -0.166308 0.369781 2.102410 -0.718201 -1.327523 -1.974855 -0.365123 17.668369 -1.476474 5.697848 -0.667146 -1.802730 -0.175989 -0.114165 -1.113012 0.866328 1.124912 -0.274624 -0.420771 -0.634082 -0.420340 0.293597 -0.108376 0.023019 -0.281843 -0.944111 10.166463 -0.059523 0.351988 1.451738 -0.242330 0.016421 0.796249 0.622291 -0.831222 0.749396
41 -0.485074 -0.387487 -0.246596 -0.410003 -0.203506 -0.991582 -0.068185 -1.739376 -0.091484 0.276020 -1.554028 0.027120 2.183001 -0.119034 -0.130952 -0.437169 0.213353 -0.172203 2.233237 -0.245236 -0.166308 0.392940 0.097160 1.185432 0.630852 -0.686110 -0.601335 -0.181093 2.001171 -0.887767 1.600165 -0.233379 -0.178901 -0.114165 -0.713806 -0.052342 -1.389745 -0.274422 -0.207230 -1.123795 -0.063619 -0.304459 -0.376967 -0.030836 -0.488763 -0.126239 0.004055 -0.047260 -0.014556 -0.965641 -0.396663 0.089341 -1.182125 -0.835983 -0.807214 0.749396
42 -0.429662 -1.159860 0.107021 1.482799 -0.203506 -0.907136 -0.084117 -0.050625 -0.106195 0.421905 -0.136277 0.135606 -0.868792 -0.230851 -0.130952 -0.287016 0.344550 -0.411910 -0.439567 0.034718 0.009750 0.240090 -0.673284 0.013138 -1.327523 -0.370465 1.731771 -0.181093 0.858169 -0.542843 -0.396511 -0.521198 -0.217917 -0.114165 2.460339 -0.698026 -0.446749 0.026812 -0.427051 1.102205 -0.420340 -0.299036 -0.404369 -0.379963 -0.140145 -0.017066 -0.480085 -0.065843 -0.239782 1.045276 -0.396663 0.039991 -0.719101 -0.159544 1.301265 -1.334408
43 0.051989 1.303219 -0.246596 2.090740 0.764120 0.235375 20.624071 0.908791 2.616956 11.064471 1.180206 -0.151595 2.648425 -0.321829 2.177288 -0.185603 0.302786 0.803198 0.795004 2.386332 -0.166308 -1.420213 -1.287982 2.422256 1.615611 -2.211225 -0.675837 -0.181093 -0.552345 2.953496 0.208079 -0.512693 -0.217917 -0.114165 0.082384 0.204837 -0.809752 -0.085094 -0.427051 1.102205 -0.420340 0.139620 -0.394364 -0.379963 -0.385603 8.629809 -0.480085 -0.040935 -0.239782 0.684274 -0.008072 -0.124041 4.979105 0.581797 1.301265 -1.334408
44 0.034940 -0.091923 -0.246596 0.417228 -0.203506 0.013496 -0.084117 0.442617 -0.106195 1.042949 2.091617 0.662195 -0.042496 -0.158573 -0.130952 0.156225 0.005242 -0.176291 0.399413 0.594626 -0.139517 0.789555 0.594214 -0.814996 -1.327523 0.059828 0.244053 -0.181093 -1.087367 0.125317 0.015366 -1.285283 0.512127 -0.114165 -0.544693 1.054805 2.230144 -0.029903 2.116590 -0.217439 0.354364 -0.304459 -0.163425 1.364434 -0.360335 0.245690 0.392227 -0.054916 0.581629 0.476586 -0.208792 1.057988 0.465092 -0.002673 -0.816986 0.749396
45 -0.127032 -0.692009 -0.117994 -0.419419 -0.203506 1.094742 -0.013647 -0.707779 -0.087093 0.077167 -1.047688 -0.351600 0.667499 -0.095411 0.106774 -0.437169 -0.058002 -0.202306 0.066810 -0.329222 -0.166308 -0.979795 0.033043 -0.363287 1.272834 -0.334510 -0.477520 -0.094410 0.955445 -0.484939 0.100555 -1.516254 -0.114845 -0.114165 0.611449 -0.472462 2.540420 0.038980 -0.182108 -0.607269 -0.159437 -0.167508 0.093684 0.015590 -0.576678 -0.574958 0.320373 -0.065843 -0.239782 1.408061 0.085035 -0.640420 0.267292 -1.080800 -0.821754 0.749396
46 0.034940 -0.506971 -0.246596 0.283373 -0.203506 0.129403 -0.084117 0.170432 -0.106195 0.314712 -0.440081 -0.340238 -0.209547 -0.163704 -0.130952 -0.265828 -0.407618 0.481954 -0.556898 0.734603 -0.166308 0.303347 -0.499695 0.443338 0.637306 -0.333444 0.426962 -0.181093 -0.041642 -0.132179 -0.307779 -0.911074 -0.217917 -0.114165 1.388630 -0.561836 -0.262205 -0.223855 -0.427051 1.102205 -0.345533 0.040648 -0.309450 -0.379963 -0.510409 -0.735017 -0.480085 -0.065843 -0.195620 -0.491435 -0.396663 0.540534 -0.056788 -0.672748 1.301265 -1.334408
47 -0.664095 -1.207828 -0.246596 -0.040487 -0.203506 -0.724996 -0.084117 0.675704 -0.106195 0.004546 -0.541349 -0.373200 -0.918392 -0.321829 -0.130952 -0.310238 -0.796809 -0.739695 -0.412643 0.062714 -0.128034 1.302759 -0.546891 -0.051392 0.730098 0.980498 -0.709474 -0.181093 -0.065961 -0.420004 -0.678815 1.452354 -0.217917 -0.114165 -0.498032 -0.698026 -1.101776 -0.340511 -0.427051 1.102205 -0.139889 0.032337 -0.420484 -0.379963 -0.288372 0.073603 -0.480085 -0.065843 0.087016 -1.129652 -0.396663 0.737421 -0.234100 -0.529235 1.301265 -1.334408
48 -0.050308 0.557086 -0.246596 -0.344436 -0.203506 -0.476624 -0.084117 0.341864 -0.031326 0.845949 -1.148956 -0.156427 -0.959880 -0.060515 -0.130952 0.197355 0.338112 -0.330298 -0.416842 -0.301227 -0.166308 0.611693 0.099577 -0.341777 0.627060 0.371887 -0.631184 0.012903 0.031316 0.284415 0.408309 -0.670255 -0.013908 -0.114165 -0.638013 -0.109493 0.404990 0.218843 0.646930 1.102205 0.122861 -0.304459 0.672217 0.248961 -0.080124 0.480689 -0.083924 -0.047463 0.215086 0.552352 -0.126381 0.218125 -0.483306 -0.672398 -0.809028 0.749396
49 -0.412613 -0.717299 -0.246596 -0.261851 -0.203506 -0.132215 -0.084117 -0.104761 -0.106195 -0.060725 0.167527 -0.444510 -0.777986 0.160947 -0.130952 -0.354975 0.761635 -0.732857 0.482038 -0.511192 -0.166308 0.093725 -1.506004 0.567020 0.775188 -0.170825 1.392807 -0.181093 0.846009 0.562666 -0.057975 -0.407950 -0.217917 -0.114165 -0.743431 -0.329585 -0.677935 -0.429365 -0.427051 1.102205 -0.337375 0.010258 -0.309992 -0.379963 -0.358565 0.919232 -0.480085 -0.065843 0.290161 -0.154500 0.067931 0.460639 0.381651 -0.612252 1.301265 -1.334408
50 -0.127032 0.723452 -0.221461 -0.192786 -0.004272 0.510240 -0.084117 2.546413 -0.106195 0.202798 0.775134 -0.099084 -0.659916 0.034491 -0.130952 -0.293051 -0.138477 -0.012622 1.249508 0.314672 -0.131862 0.128133 0.513520 -0.212717 0.715300 0.207523 0.058068 -0.181093 2.633470 0.810928 0.456057 1.074563 -0.217917 -0.114165 -0.320031 0.288131 0.252894 0.166027 -0.427051 1.102205 -0.151741 0.829182 0.224087 -0.379963 0.426668 0.214233 -0.480085 -0.049089 0.144427 0.010403 2.692142 1.614045 0.656932 0.512628 1.301265 -1.334408
51 -0.314577 -1.108256 -0.246596 -0.322537 0.174981 0.586408 0.238204 -0.289726 -0.106195 -0.320607 -2.060368 -0.558821 0.038480 -0.148931 -0.130952 -0.408962 1.263420 -0.250916 0.257504 1.602461 -0.020868 1.237913 -0.538603 2.992270 0.792473 -0.970742 -0.192545 -0.181093 -0.357791 0.727186 0.212661 -0.130426 -0.217917 -0.114165 0.573430 -0.429295 -0.004656 -0.254932 -0.427051 -0.967535 0.028659 0.574965 -0.454046 -0.379963 -0.458294 0.101359 -0.480085 -0.065843 0.020774 -0.431714 0.444185 0.523977 0.016596 1.155357 1.301265 -1.334408
52 -0.306053 0.762883 -0.246596 -0.440411 -0.203506 -0.619024 -0.084117 0.235095 -0.106195 -0.241140 -0.237545 -0.048337 0.417365 0.442723 -0.130952 -0.198525 0.344349 0.193414 -0.588640 -0.413208 -0.166308 0.534144 -0.143539 0.507868 0.375203 1.024303 0.394158 -0.181093 3.508961 -0.637493 0.721861 1.809553 -0.179871 -0.114165 -0.041303 0.590910 -0.262205 -0.354053 -0.131863 -0.857907 -0.125574 -0.163951 -0.117099 -0.205400 0.294297 -0.353837 -0.296192 -0.049157 -0.239782 0.958814 -0.016951 -0.750370 0.080667 -0.820386 -0.797931 0.749396
53 0.367408 -0.851291 0.086012 0.397230 -0.203506 -0.003062 0.073980 -0.942369 -0.106195 -0.153128 -0.642617 2.315413 0.685638 -0.137000 -0.130952 0.611668 -0.738654 2.187030 -0.339897 0.426654 -0.059142 0.249089 -0.528473 -0.352532 0.647205 0.279238 -0.310382 -0.181093 -1.160325 -0.487227 0.318037 -0.206522 -0.121639 -0.114165 -0.576046 -0.150836 -0.394022 -0.123185 -0.156986 0.394554 -0.191146 -0.004850 -0.338625 -0.079120 0.191180 -0.044822 0.061887 -0.048005 -0.239782 -0.863134 -0.396663 -0.653248 -1.211925 -0.707087 -0.819846 0.749396
54 0.256585 -0.010711 0.110247 -0.412120 -0.203506 0.997049 -0.084117 0.489235 -0.106195 -0.494678 1.281474 -0.299442 -0.709347 -0.145118 -0.130952 -0.290034 -0.749139 -0.717396 0.012961 0.650617 0.136053 0.219710 0.870482 -0.363287 0.725248 0.512798 0.203471 0.338337 -0.187557 -0.800742 -0.557403 -0.539550 -0.115816 -0.114165 0.047080 -0.556364 0.376599 -0.152058 -0.069057 1.102205 -0.096174 0.073275 5.750181 0.159822 -0.205263 0.112462 0.253816 -0.061918 -0.107296 0.366948 -0.396663 -0.699356 -0.073178 -0.848292 -0.816755 0.749396
55 -0.416875 -0.198690 -0.246596 -0.267337 -0.203506 -0.592531 0.136483 1.041124 -0.027795 -0.059187 -0.237545 -0.459535 -0.731738 -0.321829 0.851680 -0.407912 -0.301793 -0.366421 -0.085844 -0.189245 -0.166308 0.037879 1.960938 -1.707660 1.928693 0.629868 -0.473772 1.442084 -1.549432 -0.688598 -1.264399 0.121135 -0.217917 -0.114165 0.516894 0.390273 0.244782 0.042667 -0.427051 1.102205 1.671193 0.347653 -0.377460 -0.379963 -0.398616 -0.168798 -0.480085 -0.053967 0.290161 1.423215 -0.396663 5.969436 0.205828 -1.191756 1.301265 -1.334408
56 1.654654 2.695364 0.729319 1.167490 0.367079 1.367951 -0.084117 1.364438 0.245883 -0.227573 -0.845152 0.303730 0.204123 0.140588 -0.130952 -0.431265 1.165028 3.204352 -0.053239 0.398658 0.070988 -1.049404 1.478734 -0.460081 0.695267 -2.002669 -1.129320 0.469609 -1.354878 7.282662 0.744697 1.188259 0.509215 -0.114165 -0.584440 0.897944 -1.012547 0.554387 0.006310 -0.455455 0.740331 0.015152 2.091860 -0.210352 -0.009255 -0.884899 0.084546 -0.062252 -0.067550 -1.052103 -0.396663 -0.511855 1.460425 -0.113280 -0.829842 0.749396
57 0.819223 0.514824 1.779814 -0.457947 -0.203506 -1.079340 0.065401 -0.313787 0.051873 -0.362705 0.775134 -0.525703 -0.900451 -0.016174 0.386947 -0.384560 -0.793779 0.977571 -0.430675 0.342668 -0.166308 2.942682 -0.083220 -0.460081 0.789986 -1.000979 0.064624 -0.181093 -0.455068 -0.131629 -0.911259 -0.044484 -0.215782 -0.114165 -0.057103 5.594654 -0.872618 -0.450649 -0.420771 1.102205 1.176170 -0.013865 4.594792 -0.379963 0.174316 -0.305727 -0.298227 -0.059465 0.391733 -0.367536 -0.396663 0.312447 -0.177479 -0.666383 -0.813310 0.749396
58 0.580528 1.650234 -0.246596 0.088098 0.814356 0.039989 -0.084117 -0.388976 -0.065728 -0.169582 -0.136277 -0.587768 0.550814 -0.183869 -0.130952 -0.437169 -1.345931 0.024393 -0.417706 -0.021273 -0.166308 -1.523436 -0.719789 0.023893 -1.327523 -1.414603 -0.999406 0.157318 0.055635 1.562153 -0.755771 -0.005988 -0.217723 -0.114165 -0.031181 0.103303 -0.750941 -0.020498 -0.395648 1.102205 -0.420340 -0.204624 -0.336457 -0.181877 0.610261 -1.615803 -0.213995 -0.059184 0.334323 -0.404973 -0.396663 -0.771181 -1.409725 0.301206 -0.659522 0.749396
59 -0.399826 -0.021186 -0.246596 -0.340635 -0.203506 -0.463378 -0.084117 -0.109272 -0.106195 0.031248 0.167527 -0.258647 0.371004 -0.140737 -0.130952 -0.075071 0.102617 -0.142547 -0.865418 0.006723 -0.166308 0.069110 -0.083451 0.712212 0.224743 1.258638 -0.492337 -0.181093 0.225869 -0.577450 0.170209 0.124716 -0.007114 -0.114165 -0.681711 -0.480366 0.492192 -0.048029 0.232411 1.102205 -0.420340 -0.218534 -0.203689 0.143108 2.353612 -0.272420 0.074945 -0.065843 0.069352 -0.224026 -0.396663 -0.561946 -0.326109 -0.874378 -0.818512 0.749396
60 -0.271953 -0.887306 -0.246596 -0.173047 -0.203506 -0.410392 -0.066959 0.520814 -0.073423 -0.133017 -0.642617 -0.275261 -0.747511 -0.146109 -0.130952 0.053238 -0.378197 -0.432870 -0.406220 -0.049268 -0.105070 -0.133895 0.791631 -0.728956 0.727375 0.194924 -0.475142 -0.181093 0.858169 -0.434236 0.065048 1.164982 -0.217917 -0.114165 -0.143512 -0.137460 0.634148 -0.153565 -0.427051 1.102205 -0.420340 -0.304459 -0.321278 -0.379963 -0.188544 0.182777 -0.480085 -0.052283 -0.239782 1.826111 -0.396663 1.286069 1.001871 -0.591410 1.301265 -1.334408
61 -0.391301 -0.621230 0.119626 -0.258785 -0.203506 -0.824345 -0.084117 0.256148 -0.106195 0.091323 -0.035009 -0.566499 -0.397750 -0.015270 -0.130952 -0.232767 -0.082842 -0.522212 -0.051263 -0.329222 -0.062969 0.428274 -0.491407 0.583153 -1.327523 -0.600536 -0.526014 -0.181093 2.147086 -0.468123 -0.474648 -0.472407 -0.217917 -0.114165 1.072869 -0.698026 0.982955 0.260035 -0.427051 1.102205 -0.420340 0.009236 -0.287815 -0.379963 -0.379879 -0.357538 -0.480085 -0.060849 -0.239782 -0.618009 -0.396663 0.643419 0.400648 -0.931588 1.301265 -1.334408
62 -0.271953 -0.917308 -0.246596 -0.152573 -0.203506 -0.612401 -0.057767 -0.020549 -0.106195 -0.400430 0.268795 0.004593 -0.968076 -0.183929 -0.130952 0.232122 -0.456660 0.049813 -0.223060 -0.609176 -0.166308 0.370310 -0.864485 -0.513856 0.770307 1.080997 -0.751426 -0.181093 -0.479387 -0.379382 0.641969 -1.692616 0.397603 -0.114165 -0.882178 0.373250 -1.122056 -0.570613 0.257534 -1.357948 0.270937 -0.011470 -0.260709 0.246485 -0.078717 -0.914505 0.618998 -0.046558 -0.151458 -1.385473 -0.396663 0.015600 -1.648873 -0.206612 -0.827538 0.749396
63 0.239535 0.161583 0.345241 -0.144323 -0.203506 1.063282 -0.026516 0.711795 -0.075777 0.232468 1.889081 -0.194442 1.948226 -0.124231 -0.130952 -0.322308 -0.918539 -0.106572 1.486517 1.602461 0.021233 0.314728 0.496829 1.271472 -1.327523 -0.157742 -0.397309 -0.181093 0.104274 -0.360152 -0.309139 0.526678 -0.127462 -0.114165 1.186681 -0.328977 1.487915 0.046269 0.144483 1.102205 -0.095559 -0.167666 -0.039675 0.120824 -0.121347 0.365965 -0.081637 -0.058187 -0.147042 0.737756 -0.284269 -0.494075 1.904823 -0.900046 -0.804311 0.749396
64 0.452656 0.364358 -0.246596 -0.166438 -0.203506 -0.248122 -0.084117 0.323818 -0.106195 0.276605 0.471331 0.030915 -0.857977 -0.034765 -0.130952 0.436917 -0.556733 1.106455 -0.250355 0.006723 -0.097416 -1.430800 -0.016916 -2.051820 -1.327523 0.502137 -0.053173 -0.181093 -1.281921 -0.144548 -0.519247 -0.723969 -0.217917 -0.035059 0.986214 -0.152052 0.415130 0.122426 -0.427051 1.102205 -0.420340 -0.032458 1.397957 -0.379963 -0.488909 0.223485 -0.480085 -0.047966 -0.133793 1.130847 -0.383879 0.487272 -0.649070 0.255467 1.301265 -1.334408
65 -0.220805 -0.491965 -0.246596 -0.354889 -0.203506 -0.622336 -0.084117 1.331354 -0.026618 0.236646 -0.338813 0.068114 -0.087420 0.103937 -0.130952 -0.222862 -0.408328 -0.453533 -0.035577 0.062714 -0.166308 -0.185506 0.378264 0.669192 0.595418 0.234658 0.479768 -0.181093 2.414597 -0.640119 0.708259 1.823877 0.637327 -0.114165 0.656135 -0.158132 -0.099969 0.678157 1.061450 -1.057502 -0.159283 -0.304459 -0.195803 0.534329 -0.173453 -0.209506 0.377097 -0.064478 0.321074 -0.108149 0.032703 -0.628126 -0.091058 -1.346109 -0.824752 0.749396
66 -0.229329 -1.070469 -0.246596 -0.219047 -0.203506 0.036678 -0.084117 -0.056640 -0.106195 0.437585 -0.440081 -0.454438 0.547400 0.241411 -0.130952 -0.435135 -0.042605 -0.489359 0.216376 -0.861135 -0.135689 -0.112986 0.213538 1.142412 0.641994 -0.357866 -0.779542 -0.181093 0.274508 -0.013068 0.476102 -0.821550 -0.005173 -0.114165 -0.686155 0.140998 -0.154724 -0.100771 0.043993 0.517501 -0.420340 -0.290162 -0.186440 -0.048788 -0.235498 -0.353837 0.026811 -0.041578 0.263664 0.790346 -0.396663 -0.250541 -0.732139 -0.540635 -0.822572 0.749396
67 -0.629996 0.420551 -0.246596 -0.402185 -0.203506 -0.393834 -0.084117 0.654651 -0.106195 0.143989 0.673866 -0.151705 -0.557421 0.057642 -0.130952 -0.259924 0.376823 -0.288377 0.047296 -0.301227 -0.166308 0.641866 0.039604 0.195973 -0.582901 -0.571462 -0.390620 -0.181093 2.074129 -0.454590 0.592073 -0.057017 -0.137168 -0.114165 -0.681218 -0.558188 0.845055 5.779754 0.169605 -0.897021 -0.125574 -0.304459 -0.320539 0.000115 -0.336600 -0.135491 -0.206080 -0.065843 -0.222117 0.448954 -0.396663 -0.528713 -1.182870 -0.955157 -0.799579 0.749396
68 1.066442 -0.232945 0.047297 0.091078 -0.203506 2.106444 -0.084117 2.660701 -0.056041 1.352516 -0.338813 0.389248 -0.275257 -0.089230 -0.130952 -0.277833 2.665979 -0.220442 0.350628 0.538635 0.537926 -0.447534 1.517296 0.185218 0.688477 -0.266962 0.873164 -0.181093 0.152912 -0.246293 -0.614673 1.805077 -0.217917 -0.114165 0.805745 0.880312 0.970788 1.049219 -0.427051 -0.882958 -0.420340 -0.244558 0.172439 -0.379963 -0.217880 1.709349 -0.480085 -0.065843 0.369652 1.112129 0.147345 2.922227 0.295230 0.213994 1.301265 -1.334408
69 0.205436 0.408475 -0.246596 0.046590 0.198958 0.288361 -0.084117 -0.868684 0.022088 0.068631 1.889081 0.096995 -0.007514 -0.198767 -0.130952 -0.009408 -0.783625 -1.250474 1.881120 1.406493 -0.166308 -2.397656 0.184530 0.142198 0.859449 -0.313093 0.270974 0.208895 1.636383 0.809912 0.755936 -0.010465 -0.217917 -0.114165 -1.400381 1.288880 -1.389745 0.339036 -0.427051 -0.635192 -0.420340 0.038887 0.409736 -0.379963 0.370836 -2.200526 -0.480085 -0.054006 -0.155874 -0.366644 -0.396663 1.307203 -0.371182 2.240442 1.301265 -1.334408
70 3.393716 1.072103 -0.246596 7.598404 4.381156 0.612901 -0.084117 0.274193 -0.106195 -0.132177 0.370063 -0.251101 0.452629 -0.199768 -0.130952 -0.437169 -0.851934 0.015177 -0.618405 0.594626 -0.166308 -0.904363 -0.079076 -0.782731 0.674780 -1.911474 -0.850591 1.415464 0.055635 3.198941 -0.437709 -0.591474 0.188936 -0.114165 -0.606413 -0.142324 -0.542062 0.005476 -0.420771 1.102205 -0.420340 0.031950 -0.213742 -0.379963 1.263957 -0.115136 1.853031 -0.065843 -0.111712 0.088843 -0.396663 -0.198827 -0.446428 2.835613 -0.832396 0.749396
71 3.214695 2.400907 -0.246596 0.802337 0.360514 4.144750 1.121216 1.460680 0.191337 -0.338997 0.167527 7.517015 2.648425 -0.163244 0.475094 -0.437169 0.363995 0.098944 -0.497739 0.706608 -0.166308 -0.781024 0.124557 0.206728 -1.327523 0.137552 -0.539206 0.388250 -0.406430 1.242475 -0.852772 2.260753 -0.166284 -0.114165 -0.480380 1.808710 -1.178838 -0.271515 -0.295159 1.102205 -0.206846 -0.191806 -0.171507 -0.308776 2.165836 1.562243 -0.244050 -0.012078 0.873098 2.304773 1.356540 -0.767239 1.323715 2.278034 -0.820785 0.749396
72 -0.246379 -0.002716 -0.246596 -0.207903 -0.203506 0.122780 -0.084117 -0.709283 -0.106195 -0.316371 0.977670 -0.045028 0.087742 -0.215959 -0.130952 -0.155821 -0.746287 -0.463493 -0.686333 -0.021273 0.021233 -0.808021 0.102801 0.131443 0.253860 0.475389 -0.892477 -0.181093 0.006997 -0.580224 0.194120 -0.170712 -0.047877 -0.114165 -0.264977 -0.135636 -0.515699 -0.025002 0.207289 -0.743363 0.662137 -0.304459 -0.414472 0.084921 -0.191404 -0.135491 -0.019005 -0.048378 -0.239782 -0.109040 -0.157974 -0.327839 -0.644973 -0.454472 -0.815822 0.749396
73 0.512329 2.724464 -0.246596 0.669346 -0.203506 -0.019620 -0.081053 3.332893 -0.061654 -0.703351 -0.136277 1.667077 2.648425 -0.321829 -0.130952 1.039822 -0.542532 1.128307 -0.610994 -0.133254 0.216428 -0.667479 0.204675 0.841272 -1.327523 -0.128958 -0.891778 0.691723 -0.430749 1.344346 4.264140 -0.338121 -0.184724 -0.114165 0.230019 1.055413 -0.049271 0.352558 -0.295159 -1.037945 -0.256949 0.110884 -0.180969 -0.235112 1.978790 -0.179900 -0.146553 -0.051630 0.109097 0.923159 -0.181199 1.424178 -0.920626 1.403566 -0.814408 0.749396
74 -0.442450 -0.750526 -0.246596 -0.393979 -0.203506 -0.622336 -0.084117 -1.739376 -0.079805 -0.401312 -0.237545 -0.292360 -0.472192 -0.195775 -0.130952 -0.029678 -0.654131 -0.668340 -0.365834 -0.637172 -0.166308 0.306524 -0.175540 -1.417276 0.647236 0.475583 -0.829595 -0.181093 -1.768305 -0.986462 -2.110059 -0.731131 -0.188606 -0.114165 -1.045120 -0.698026 0.015624 -0.346329 -0.232353 -1.073162 -0.420340 -0.304459 -0.341977 -0.003599 1.400340 -0.642498 0.028264 -0.054669 0.029606 -0.339903 -0.396663 -0.222062 -0.821912 -1.195043 -0.802768 0.749396
75 -0.391301 0.387773 -0.193249 -0.364953 -0.203506 -0.244811 -0.084117 -0.303261 -0.106195 0.191216 -0.946420 -0.660423 -0.310515 0.455756 -0.130952 -0.319028 0.769908 2.054578 0.136344 0.006723 -0.166308 -0.894305 -0.292724 -0.449327 -1.327523 -0.371240 -0.796267 -0.181093 0.250189 -0.062542 -0.403562 -1.072216 -0.076994 -0.114165 -0.519017 -0.698026 0.021708 0.086310 -0.219792 -0.255929 -0.200074 -0.266532 0.667485 -0.232017 -0.650473 0.203131 -0.025770 -0.044734 -0.032221 -0.806978 -0.396663 -0.666860 -1.077078 -0.593019 -0.825818 0.749396
76 -0.033259 -0.615522 0.123602 -0.377695 -0.203506 0.785105 -0.071248 -0.937858 -0.089311 -0.374752 -0.136277 -0.320204 0.368588 0.029739 0.067346 -0.215187 -0.847023 -0.393923 1.765518 -0.049268 -0.166308 -0.741191 -0.475982 0.518623 -1.327523 0.642661 -0.759540 -0.181093 1.928213 -0.846849 0.661083 -0.225322 -0.113874 -0.114165 0.450236 -0.402543 -0.710382 -0.235867 -0.207230 -0.060294 -0.420340 -0.300603 0.263366 -0.010408 -0.467987 -0.536100 0.359799 -0.053386 -0.164706 0.575527 0.631578 -0.653276 0.458387 -1.252148 -0.822921 0.749396
77 -0.289003 0.394316 0.131930 0.875031 -0.203506 -0.738243 -0.084117 -0.083708 -0.106195 -0.313427 -0.541349 0.090552 -0.411438 0.184671 0.282798 -0.116201 -0.091470 -0.084125 -0.251343 0.062714 -0.166308 0.015911 -0.013462 0.604663 0.571462 -0.815488 -0.114645 -0.181093 -1.111686 0.032933 1.031619 0.704830 -0.012938 -0.114165 -0.703684 0.437697 0.830859 -0.211111 -0.006251 1.102205 -0.420340 -0.304459 -0.296390 -0.236970 -0.624125 -1.304937 -0.220868 -0.046602 -0.116128 -0.995947 0.081385 -0.604220 -0.234472 -0.069394 -0.823284 0.749396
78 0.768074 -0.492789 0.680325 -0.346207 -0.203506 -0.724996 -0.084117 0.796007 -0.106195 0.159835 1.180206 -0.454173 0.498539 -0.121443 -0.130952 -0.315092 0.034994 -0.942163 -0.336069 -0.357217 -0.166308 0.095049 0.157364 -0.858016 -1.327523 -0.315709 -0.587808 -0.181093 0.979765 -0.986462 -2.887210 -0.764255 -0.166284 -0.114165 1.160018 -0.698026 1.066101 -0.233505 0.207289 1.102205 -0.420340 -0.304459 2.682615 0.339338 -0.484385 0.167973 -0.096350 -0.054427 0.117930 -0.533329 -0.396663 -0.619240 2.065373 -1.139513 -0.778067 0.749396
79 0.657251 -0.828450 -0.246596 0.492514 -0.203506 -0.069294 -0.084117 0.227576 -0.089583 -0.438869 0.471331 0.737917 -0.789450 0.021971 -0.130952 -0.297315 -0.123080 0.213483 -0.699919 0.734603 -0.166308 -0.568755 0.140672 0.862782 0.800873 1.002013 -0.429832 0.836970 -0.552345 -0.025987 -0.589975 -0.223531 1.233242 -0.114165 0.197431 0.710075 -1.379605 0.221920 2.681843 -1.024252 -0.420340 -0.117590 -0.127399 2.603092 -0.117826 -0.063325 1.244583 -0.057060 -0.147042 -0.751713 -0.396663 -0.496712 -0.072433 -0.362434 -0.822973 0.749396
80 -0.391301 1.204553 -0.246596 -0.311869 -0.203506 -0.645517 -0.084117 -0.776953 -0.106195 -0.217064 0.370063 0.345981 0.310871 -0.175257 -0.130952 -0.149721 1.184602 -0.447439 -0.268017 0.426654 -0.166308 0.465328 1.054661 3.669835 1.417051 -1.202945 -0.536627 -0.181093 0.979765 0.508681 1.228197 -0.295150 0.508245 0.279337 0.115960 0.055272 0.240726 -0.453053 1.187062 1.102205 -0.420340 -0.000888 -0.271108 0.570851 1.880792 -0.575884 0.274790 -0.058211 -0.239782 0.736864 0.945665 -0.728618 1.302110 -0.384674 -0.822601 0.749396
81 -0.109982 -0.171788 -0.246596 -0.273297 -0.203506 -0.844215 -0.084117 0.257652 -0.046671 0.030169 0.268795 -0.101180 2.648425 -0.061580 -0.130952 0.365219 -0.171767 0.205753 0.681131 0.258681 -0.166308 0.111458 0.277541 -0.879526 -1.327523 0.217214 4.169752 -0.181093 -0.406430 -0.760226 0.122318 -1.166216 0.286961 -0.114165 -0.805152 0.143430 2.203781 0.060711 0.445951 -1.180223 -0.420340 -0.190644 -0.105517 0.512663 -0.374634 -0.500018 0.618846 -0.048223 -0.147042 0.867895 -0.396663 -0.143232 -0.392787 -0.317254 -0.824785 0.749396
82 -0.527698 0.756573 -0.246596 0.044171 -0.203506 -0.552792 -0.084117 -0.020549 -0.061427 0.135604 0.268795 0.022200 -0.892719 -0.005468 -0.130952 -0.095308 -0.131044 0.147703 -0.049163 -0.315224 -0.166308 0.285482 0.253827 1.475817 0.702765 0.482948 -0.419535 -0.181093 0.225869 0.372795 0.600556 -0.187722 -0.217917 -0.114165 -0.293985 1.221394 -0.614055 -0.174072 -0.427051 0.595034 -0.420340 0.553678 -0.346856 -0.379963 -0.392475 0.180926 -0.480085 -0.065843 -0.239782 -0.592159 1.326302 0.512784 -0.553337 0.261027 1.301265 -1.334408
83 -0.655570 0.142819 -0.246596 -0.394411 -0.203506 -0.883955 -0.084117 0.456151 -0.038749 0.016899 -2.060368 -0.297103 -0.785084 -0.146377 -0.130952 -0.287344 -0.193933 -0.429154 0.104109 -0.385213 -0.166308 0.368457 -0.042010 1.336002 0.615967 -0.319101 -0.799491 -0.181093 1.830937 -0.392556 1.531727 0.089802 -0.217917 -0.114165 -0.166225 -0.319857 -0.550174 -0.310030 -0.427051 -0.739535 0.135945 -0.304459 -0.303634 -0.379963 0.038356 0.787854 -0.480085 -0.065843 -0.036637 0.672686 -0.396663 0.852684 1.402314 -0.759261 1.301265 -1.334408
84 -0.612946 -0.927449 -0.246596 -0.354025 -0.203506 0.073105 -0.035095 -0.268673 -0.046082 0.001645 0.471331 -0.301737 0.133821 -0.121861 0.285141 0.045497 0.497689 0.304906 0.883929 -0.413208 -0.112725 1.005794 -0.094271 0.389563 0.674450 -0.710629 -0.709528 -0.181093 0.931126 0.646959 0.524996 0.976087 -0.217917 -0.114165 -0.790339 0.268676 -0.049271 0.000566 -0.427051 1.102205 0.549080 2.700015 -0.203689 -0.379963 -0.011957 0.136517 -0.480085 -0.040601 0.184172 -0.114388 1.110742 -0.005342 -0.466915 1.368947 1.301265 -1.334408
85 -0.416875 -1.213877 -0.246596 -0.260771 -0.203506 -0.218318 -0.084117 -0.304764 -0.106195 -0.126884 -0.136277 -0.299376 -1.010888 -0.091700 -0.130952 -0.434545 0.313934 -0.347542 1.116491 -0.049268 -0.166308 1.273909 0.352478 -0.384797 1.043472 0.601763 -0.262183 -0.181093 -0.406430 -0.721659 -0.738161 -0.860045 0.079264 -0.114165 -0.572343 -0.108277 0.114993 -0.525101 0.169605 -1.432903 0.019039 -0.295585 -0.244643 0.060160 0.667834 0.068052 0.140203 -0.065843 -0.102880 -0.252550 -0.129011 0.378527 0.373828 -0.548468 -0.823516 0.749396
86 -0.510648 -0.356962 -0.246596 -0.412163 -0.203506 -0.327601 -0.084117 -0.047617 -0.095603 -0.386925 -0.035009 -0.371877 -0.467291 -0.215865 -0.130952 -0.139619 -0.066783 -0.487873 -0.159701 -0.329222 -0.166308 0.643719 -0.127769 0.174463 0.716786 0.066999 -0.564044 -0.181093 -0.284834 -0.766601 0.088457 1.147973 -0.217917 -0.114165 0.492700 -0.276082 1.254701 -0.356719 -0.427051 1.102205 -0.170289 -0.063535 -0.413585 -0.379963 1.401375 -0.957064 -0.480085 -0.043505 -0.239782 -0.846198 -0.122906 -0.142423 0.730688 -0.963200 1.301265 -1.334408
87 -0.352939 0.505856 -0.246596 -0.464901 -0.203506 -0.539545 -0.084117 -0.170927 -0.106195 -0.507649 0.268795 -0.373686 -0.674914 0.017590 -0.130952 -0.171762 -0.532579 -0.431978 -0.125984 -0.581181 -0.166308 -0.507483 -0.471723 1.465062 0.838365 1.358943 -0.100553 -0.181093 1.113520 -0.369089 0.707329 0.472069 -0.217917 -0.114165 -0.202639 -0.187923 -0.218604 -0.248478 -0.427051 1.102205 -0.420340 -0.304459 -0.479919 -0.379963 -0.090614 -0.426927 -0.480085 -0.065843 -0.239782 -1.024025 -0.070558 -0.326215 -0.880768 -0.035404 1.301265 -1.334408
88 0.162812 -0.085187 -0.246596 0.630558 -0.203506 2.854872 -0.084117 0.308780 -0.106195 -0.164768 -0.035009 -0.069430 -1.022352 -0.179364 -0.130952 -0.125056 0.996953 0.902797 -0.239487 -0.469199 -0.166308 -0.813844 -1.020692 0.550888 0.627756 -1.499692 -0.839052 -0.181093 -0.479387 -0.040728 0.145583 -0.623702 -0.217917 -0.114165 -0.401502 -0.075446 -0.546118 -0.290090 -0.427051 1.102205 -0.420340 0.383572 2.704644 -0.379963 0.026570 -0.333482 -0.480085 -0.051654 -0.155874 -0.015447 1.691828 -0.067875 -1.239490 0.497802 1.301265 -1.334408
89 -0.629996 -0.066736 -0.246596 -0.448963 -0.203506 -0.228253 -0.076763 -0.053632 -0.106195 -0.565119 1.990349 -0.568661 0.018821 0.227576 -0.130952 0.973372 1.192058 -0.132884 -0.026561 -1.001112 -0.166308 -0.090224 -0.528243 0.475603 -1.327523 -1.049048 -0.094938 -0.181093 -0.065961 -0.498240 0.588995 0.984144 -0.217723 -0.114165 -1.326564 0.040072 1.820498 -0.337830 -0.094180 1.102205 -0.420340 0.555175 -0.403334 0.799269 -0.488834 -0.611041 -0.330779 -0.048547 -0.120545 0.153912 6.152705 -0.628913 -1.082293 0.108877 0.724563 0.749396
90 -0.391301 -1.594110 -0.246596 -0.455183 -0.203506 -0.513052 -0.084117 -0.304764 0.024668 0.009219 -0.541349 -0.213086 -0.879946 0.078429 -0.130952 -0.287410 -0.127328 -0.168413 -0.455005 -0.385213 -0.078279 0.613811 -1.506004 -0.836506 0.883120 0.195796 -0.415861 -0.181093 -1.160325 -0.517492 -0.567354 0.906259 -0.217723 -0.114165 -0.748616 -0.359984 1.479803 -0.279072 -0.332842 -1.519088 0.013651 -0.293595 -0.398061 -0.249350 -0.380416 0.710138 -0.422009 -0.064507 0.042854 -0.101018 -0.168447 0.537760 0.471052 -1.277150 -0.371171 0.749396
91 -0.493599 -0.750271 -0.246596 -0.377263 -0.203506 -0.522987 0.015766 -1.007032 -0.014622 0.129619 -0.338813 -0.063186 -0.879355 -0.283692 -0.021402 -0.105639 -1.086258 -0.468994 -0.570978 -0.693162 0.162845 -1.109750 0.317600 -0.019127 -1.327523 -0.790679 -0.348372 -0.181093 0.079954 -0.185868 -0.133177 0.268850 -0.217723 -0.114165 0.153980 -0.698026 -0.777304 -0.216276 -0.420771 1.102205 0.747874 -0.304459 -0.168599 -0.355821 0.088678 -0.194703 -0.292729 -0.047512 -0.239782 -0.001185 -0.396663 0.462596 0.796994 -0.450835 -0.774863 0.749396
92 0.213961 -0.377748 -0.246596 -0.258310 -0.203506 -0.016308 0.020055 -0.838608 -0.106195 0.060306 0.673866 0.060502 -0.948755 -0.054234 0.381444 -0.419917 -1.307043 -0.286445 -0.275303 0.398658 -0.166308 -0.634659 0.718189 -0.212717 0.661580 -0.363100 -0.215919 -0.181093 0.006997 -0.110704 -0.202116 0.155154 0.946931 -0.114165 0.723780 0.609149 -1.004435 -0.187678 1.048889 -0.419128 1.327786 -0.304459 -0.269777 0.396287 -0.543512 -0.035570 0.381415 -0.052060 -0.014556 -0.383580 -0.396663 -0.555720 0.299327 0.975687 -0.826867 0.749396
93 -0.007684 -0.477335 0.400463 -0.352341 0.177265 -0.539545 0.089912 -1.083725 -0.041012 -0.354253 -0.136277 -0.485062 0.071029 0.299364 0.231600 -0.400106 -0.159092 0.426208 -0.322853 0.090709 -0.089761 -0.243735 0.328650 0.604663 -1.327523 0.023388 -0.742439 -0.078272 1.660702 0.153061 0.460138 -0.935245 -0.038172 -0.114165 -0.382986 0.617661 -0.639404 -0.132948 0.081677 -0.208506 1.100131 0.017336 -0.104236 -0.005456 -0.537310 -0.272420 0.002818 -0.054161 0.126762 -0.611769 -0.396663 -0.578034 -0.477346 -0.059673 -0.819821 0.749396
94 1.688754 -0.962240 -0.246596 0.300348 0.837762 0.318166 -0.084117 -0.384465 0.032861 -0.383118 1.382742 -0.269899 -0.996017 -0.015465 0.414126 -0.400631 -0.333912 3.091003 -0.157478 0.510640 -0.166308 0.176833 -0.581655 -1.750680 0.663458 -0.720901 -0.409896 0.496396 -0.868494 0.087491 -1.266045 -0.036426 -0.047877 -0.114165 -0.491119 1.062709 -0.065494 0.459595 0.219850 -0.976071 0.310496 0.084737 -0.344539 0.207486 -0.020977 0.841516 0.087648 -0.048760 0.148843 -1.118955 -0.396663 -0.443950 -1.910372 0.544590 -0.815494 0.749396
95 -0.033259 0.401438 -0.246596 0.428934 -0.203506 -0.102411 -0.084117 -1.739376 -0.058485 0.271038 1.281474 0.966275 -0.296325 0.770465 -0.130952 -0.088912 0.230252 -0.424248 -0.198729 0.762599 -0.166308 -0.235795 -0.352122 0.948822 0.575000 0.011953 0.717391 -0.181093 -0.503706 -0.986462 0.948005 1.284944 18.937924 -0.114165 0.552198 -0.093685 7.685327 0.068729 14.426556 1.102205 0.112933 -0.189147 -0.301910 16.325515 -0.277827 -0.511120 6.866468 -0.025807 0.329907 0.232352 -0.396663 -0.232026 0.198378 -0.967396 -0.829077 0.749396
96 0.051989 -0.349332 0.090514 0.158632 -0.203506 0.023431 -0.084117 -1.037108 -0.106195 -0.435966 -0.136277 -0.511516 -0.801983 0.664044 -0.003952 -0.373605 0.112084 1.318140 -0.692632 -0.385213 -0.078279 -0.183389 -1.201188 0.217483 0.825943 -0.009465 -0.058117 -0.181093 -0.430749 0.565504 0.187605 0.608145 -0.200253 -0.114165 0.194468 0.572670 0.457717 -0.519858 -0.182108 -1.242803 -0.420340 -0.291834 -0.136910 -0.149688 -0.332409 -0.931159 -0.285912 -0.056401 -0.239782 1.453521 -0.396663 -0.781193 0.304170 0.578020 -0.771828 0.749396
97 0.580528 3.108066 1.077458 0.102006 -0.203506 -0.095787 -0.084117 0.120807 0.065181 -0.484168 0.572598 0.124486 1.323962 -0.321829 -0.130952 -0.437169 -1.518053 0.778150 -0.533309 -0.161250 0.159017 -1.906685 -0.272464 -0.384797 -1.327523 -2.034263 -0.800042 0.391744 -0.090280 -0.187690 -0.984922 1.688696 -0.030408 -0.114165 -0.099690 0.467488 -0.819891 -0.349505 -0.244914 -0.336112 0.869551 -0.304459 -0.104482 -0.254302 -0.495613 0.200355 0.041911 -0.065843 -0.102880 -0.732103 -0.396663 -0.731766 -0.961229 0.687368 -0.828074 0.749396
98 -0.391301 -0.566840 -0.246596 -0.057030 -0.112738 0.722184 -0.084117 0.630590 -0.106195 0.081279 -0.743885 0.710713 -0.949628 -0.204521 -0.130952 -0.236834 1.086896 -0.201117 -0.363487 -0.301227 0.093952 -0.387453 -0.367086 1.862996 -1.327523 -0.517578 -0.312746 -0.181093 1.247276 -0.215710 2.685284 0.071897 -0.217917 -0.114165 1.271361 -0.406191 0.480024 -0.087943 -0.427051 0.231552 -0.420340 -0.217900 -0.167466 -0.379963 -0.559811 1.296712 -0.480085 -0.065843 -0.239782 0.109344 2.163914 0.249139 0.263194 1.207880 1.301265 -1.334408
99 0.162812 0.161471 0.062828 -0.199481 -0.203506 -0.016308 -0.084117 0.460663 -0.106195 0.712694 0.673866 0.206628 0.385904 0.156114 -0.130952 -0.090486 -0.592746 -0.588512 1.906439 -0.049268 0.166672 1.194771 1.113484 0.701457 0.575733 2.628984 2.121245 -0.181093 -0.065961 0.057332 0.049585 3.361000 -0.217723 -0.114165 -0.605919 0.099047 1.189806 0.537933 -0.131863 1.102205 -0.420340 -0.304459 -0.277120 -0.085929 0.025939 0.547303 -0.446662 -0.050822 -0.080799 0.052297 -0.396663 0.641433 1.267840 -0.892143 0.551552 0.749396
100 -0.246379 0.682486 0.159617 -0.350483 -0.203506 -0.503117 -0.084117 -0.176943 -0.106195 -0.312119 -0.237545 -0.345136 -0.144397 -0.004863 -0.130952 -0.372030 -0.478471 -0.771210 -0.762043 -0.469199 -0.059142 -0.098958 -1.506004 -0.384797 -1.327523 -0.438885 -0.068084 -0.181093 0.444742 -0.439192 -0.157732 -1.868082 -0.066318 -0.114165 1.295803 0.284484 0.763937 -0.238869 0.006310 -0.166190 0.302338 -0.279280 -0.133559 -0.108214 -0.506234 -0.183601 -0.095370 -0.065843 -0.239782 1.996362 -0.005777 -0.759748 -1.022320 -0.464962 -0.819677 0.749396
101 -0.357202 1.861762 -0.246596 -0.438727 -0.203506 -0.112346 -0.084117 -0.379954 -0.078131 0.283582 -1.047688 -0.286381 1.988897 -0.246773 -0.130952 0.174134 0.328136 -0.413694 -0.574436 -0.581181 -0.124207 -0.697652 0.913074 1.067127 0.614575 -0.700065 0.579968 -0.181093 0.906807 -0.077304 0.788222 1.268830 -0.039142 -0.114165 -0.415080 -0.296145 -0.994295 -0.060661 0.131922 -0.418804 -0.420340 -0.033251 -0.330986 0.305291 1.611037 -0.081829 0.305136 -0.041380 -0.239782 -0.632271 0.426172 -0.755967 -1.552022 0.493815 -0.818505 0.749396
102 -0.553272 -0.684883 -0.246596 -0.447754 -0.203506 0.238687 -0.084117 -0.190477 -0.068081 0.060981 -0.541349 0.182689 -0.735991 -0.171667 -0.130952 -0.319684 0.065078 -0.371922 -0.514165 -0.441204 -0.166308 -0.528525 -0.204779 -0.944056 0.907151 0.053237 1.490468 -0.181093 1.320233 -0.601128 -0.652829 -1.483130 -0.217917 -0.114165 1.186928 -0.698026 0.471912 -0.563096 -0.427051 1.102205 -0.420340 -0.077710 -0.293827 -0.379963 -0.294788 -0.092932 -0.480085 -0.054606 -0.239782 0.738647 -0.157878 0.166882 -0.271350 -0.482237 1.301265 -1.334408
103 -0.092932 -0.041925 -0.246596 -0.309882 -0.203506 0.026743 -0.049801 0.537356 -0.076863 0.420378 0.876402 -0.038696 0.259778 -0.220451 -0.130952 -0.009539 1.007261 -0.072530 1.246297 -0.245236 -0.105070 -0.055816 -1.506004 1.314492 0.923814 0.540708 -0.229110 -0.181093 0.347465 -0.986462 2.271581 0.890144 -0.217917 -0.114165 -0.362248 -0.116789 -0.193255 -0.350827 -0.427051 -0.840871 -0.420340 -0.064081 -0.362478 -0.379963 -0.469548 -0.226160 -0.480085 -0.050425 0.144427 -1.167089 0.786167 0.015618 -0.138739 -0.439085 1.301265 -1.334408
104 -0.561797 0.182152 -0.246596 -0.397736 -0.203506 0.730463 -0.049801 -0.404014 -0.106195 -0.529463 -0.845152 -0.531572 -0.495958 -0.222831 -0.130952 -0.386003 1.533910 -0.717545 -0.019645 -0.945121 -0.166308 -0.690638 -0.046960 -0.653671 -1.327523 -0.479298 -0.586841 -0.181093 -0.357791 -0.393106 0.552843 -1.584292 -0.101257 -0.114165 -0.325833 -0.375184 -0.239898 -0.066614 -0.131863 -0.141388 -0.384322 -0.304459 -0.377017 -0.028979 -0.191924 -0.791454 0.151679 -0.065843 -0.239782 -0.827479 0.956233 -0.547549 0.821206 -1.030585 -0.821094 0.749396
105 -0.519173 -1.594110 0.708986 -0.443694 0.274883 1.619635 -0.010584 0.535852 -0.061925 -0.517531 -0.845152 -0.138246 -0.969231 -0.183235 -0.130952 -0.315289 1.094506 -0.189224 -0.434133 -0.189245 7.790769 -0.206945 -0.571065 -0.201962 0.643710 -0.515834 0.975379 0.619182 -0.455068 -0.267641 -0.858213 -0.905702 -0.217917 7.068255 0.085594 -0.173940 1.193862 -0.492013 -0.427051 1.102205 -0.280576 0.936377 -0.306444 -0.379963 -0.353492 -0.035570 -0.480085 -0.061807 -0.045469 -0.433497 3.176948 -0.091351 -0.583137 3.020809 1.301265 -1.334408
106 -0.357202 -1.594110 0.596441 -0.338216 -0.203506 -1.705238 -0.036320 0.546378 -0.106195 0.061655 -0.440081 -0.296574 0.205616 -0.166350 -0.130952 1.057206 -0.001196 0.085639 0.541569 -0.609176 0.285320 0.097431 -0.711271 1.249962 1.013348 -0.874993 -0.203869 0.371613 -0.455068 0.245530 0.748133 -0.675626 -0.217917 0.123153 -1.274472 -0.553324 0.208279 -0.549695 -0.427051 1.102205 -0.128652 2.330187 -0.444633 -0.379963 0.306792 0.167973 -0.480085 -0.051489 -0.094047 1.244050 -0.396663 0.757249 -1.140032 1.396782 1.301265 -1.334408
107 0.375932 0.108495 1.640859 0.495753 -0.203506 2.709160 -0.084117 -0.601010 -0.106195 -0.542627 -1.047688 0.000908 1.011734 -0.041639 -0.130952 -0.379509 -0.151329 5.307155 -0.344344 1.462484 -0.166308 0.071493 -1.506004 1.110147 -1.327523 -1.340949 0.710097 -0.181093 -0.333472 -0.338591 -0.693276 -1.381968 -0.180842 -0.114165 -1.090299 -0.698026 0.033875 -0.837864 -0.420771 0.183805 -0.420340 -0.043481 -0.210046 -0.127403 -0.511779 -0.810883 6.062280 -0.061168 0.095849 -1.627033 -0.396663 -0.661445 -0.767527 -0.428315 -0.831054 0.749396
108 -0.297528 -0.432499 -0.246596 -0.396570 0.215513 -0.327601 -0.084117 -1.739376 -0.106195 0.000199 -0.136277 0.198222 2.648425 -0.137060 -0.130952 1.061142 0.598981 -0.392436 0.304930 -0.357217 -0.166308 0.546848 -0.118214 -0.836506 -0.354559 0.299299 -0.409064 -0.181093 -0.090280 0.513594 0.152026 -0.381988 0.083146 -0.114165 -1.058945 -0.257842 0.204223 1.191894 0.370584 -1.436835 -0.200381 -0.186541 -0.191516 0.789365 0.462871 0.613917 0.728424 -0.043713 0.131178 -0.416561 -0.280284 -0.686381 -0.755979 1.794938 -0.820665 0.749396
109 -0.178180 -0.556511 -0.246596 -0.138579 -0.203506 0.076417 -0.079827 -0.246117 -0.106195 -0.621475 0.268795 -0.504302 2.648425 -0.089862 -0.130952 1.269938 -1.116057 -0.387530 0.074467 1.014557 -0.166308 -0.337959 -0.078040 -0.449327 -0.348932 1.028276 0.387642 -0.181093 0.833849 -0.596680 -0.274276 0.767497 -0.159490 -0.114165 -0.186716 0.156806 0.916033 2.951613 -0.031374 -1.653339 -0.420340 -0.304459 -0.456658 -0.061787 -0.227744 -0.211357 -0.181150 -0.054669 -0.023388 -0.855112 -0.396663 1.698956 0.523948 -0.483776 -0.802280 0.749396
110 0.307734 -0.168799 -0.246596 0.710120 -0.203506 -0.327601 -0.084117 -1.739376 -0.106195 -0.508007 0.066259 0.231891 -0.298973 -0.075749 -0.130952 0.254294 -0.124606 -0.041312 0.259727 -0.525190 -0.166308 0.922157 -0.429938 0.023893 -1.327523 -0.147760 -0.340809 -0.181093 -0.698260 -0.422207 0.393347 0.771973 -0.217723 -0.114165 -1.035985 -0.377008 0.228558 0.569660 -0.357965 -0.977921 -0.420340 -0.050594 -0.307922 -0.155259 -0.042882 -0.890450 -0.338976 -0.054732 -0.005724 -0.982577 -0.068613 -0.455224 0.993304 -0.785767 -0.486512 0.749396
111 0.708400 -0.354458 -0.179594 0.084772 -0.203506 -0.330913 -0.084117 -0.819059 -0.106195 -0.274152 0.370063 -0.642905 0.758789 -0.222506 -0.130952 0.835487 -0.085233 -0.197252 0.126834 -0.161250 -0.166308 -0.025908 -0.148259 0.013138 0.647161 -0.053367 -0.574092 -0.181093 0.590657 -0.116126 0.211157 0.414773 -0.050789 -0.114165 -0.122527 -0.285810 1.362182 -0.340922 0.270095 -1.446539 -0.420340 -0.236793 0.015226 -0.011027 -0.271059 -0.105884 -0.148878 -0.059673 -0.239782 -0.452215 -0.396663 0.343970 -0.383847 0.703419 -0.813865 0.749396
112 -0.280478 -0.264370 0.506405 -0.227297 -0.203506 -0.542857 -0.035095 -0.080700 -0.106195 0.250022 0.673866 -0.088250 -1.023056 0.184418 -0.130952 -0.354122 -0.179045 -0.149533 -0.145992 1.210525 0.304457 -0.476119 -0.211800 -0.115922 0.667754 -0.208039 -0.048969 -0.181093 -0.382110 -0.744193 -0.384448 -1.602197 0.129732 -0.114165 1.307653 -0.502253 1.423020 -0.134986 0.332901 0.063634 -0.140966 -0.287204 -0.324481 0.026733 -0.217339 0.241989 0.006772 -0.046694 -0.151458 0.267116 -0.396663 -0.768576 -0.025870 -0.735972 -0.822734 0.749396
113 -0.681144 -1.137389 -0.246596 -0.491681 -0.203506 -0.667043 0.172024 -0.513791 0.080843 -0.183358 -1.351492 -0.574022 -0.885762 -0.321829 -0.130952 -0.437169 -0.476258 -0.546146 0.386691 -0.189245 -0.166308 -0.033319 -0.623326 -0.040637 -1.327523 -0.232655 0.128352 -0.181093 0.444742 -0.035009 -0.762572 0.286754 -0.217917 -0.114165 -0.058831 -0.176372 -0.594789 -0.629136 -0.427051 0.106039 -0.420340 0.005539 -0.353903 -0.379963 -0.077225 -0.448207 -0.480085 -0.057742 -0.191204 -1.067256 -0.396663 -0.105696 0.313482 -0.168671 1.301265 -1.334408
114 -0.715244 -0.301525 -0.246596 -0.427971 -0.203506 -0.741555 0.025570 -1.271698 -0.034902 -0.405826 0.471331 -0.719421 -0.986384 -0.212044 0.079104 -0.353598 -0.883035 -0.594013 -0.221208 -0.833139 -0.166308 -0.522702 -0.081609 0.109933 -1.327523 -0.603443 -0.259926 -0.181093 0.250189 -0.614873 0.411530 -0.999702 -0.217917 -0.114165 -0.905879 -0.262098 -0.846255 -0.645875 -0.427051 1.102205 0.739485 -0.191137 -0.430439 -0.379963 -0.337428 -0.357538 -0.480085 -0.045977 -0.239782 -0.673273 -0.396663 -0.473479 -0.382729 -0.548119 1.301265 -1.334408
115 0.043465 -0.600142 -0.246596 0.462106 -0.203506 -0.019620 -0.084117 0.275697 0.286079 -0.916956 1.078938 0.151403 -0.668957 -0.140049 -0.130952 -0.437169 1.422618 0.017555 -0.471679 -0.609176 -0.166308 -1.419287 -0.554258 -1.933515 -1.327523 -1.249076 -1.057882 0.169963 -1.743986 -0.455416 -2.060234 -1.149207 -0.217917 0.307734 -0.618757 -0.424431 -1.302543 -0.649621 -0.427051 1.102205 -0.420340 0.014519 -0.115596 -0.379963 1.668105 -1.423362 -0.480085 -0.065843 -0.239782 -1.456782 4.644503 -0.737182 0.353713 -1.164131 1.301265 -1.334408
116 -0.306053 0.719364 -0.246596 -0.446199 -0.203506 -0.198448 -0.084117 1.200525 -0.068444 -0.019912 0.066259 -0.245276 -0.505483 1.180025 -0.130952 -0.269371 -0.258573 0.069436 -0.658174 -1.001112 -0.131862 -0.133366 0.691483 0.400318 0.901530 -0.548591 -0.493479 -0.181093 0.250189 0.088973 1.278523 0.456849 -0.217917 -0.114165 -0.643445 -0.263922 0.285341 0.518154 -0.427051 1.102205 -0.420340 0.391354 -0.050222 -0.379963 -0.298564 0.556555 -0.480085 -0.044453 0.113513 0.284943 1.495252 -0.716595 -1.286799 1.335307 1.301265 -1.334408
117 -0.254904 -0.693916 -0.246596 0.733487 -0.203506 0.513552 -0.084117 0.681719 -0.106195 -0.384483 -0.035009 0.009425 -0.744385 0.077454 -0.130952 -0.345332 1.043629 -0.267714 -0.073494 0.314672 -0.017041 0.477239 -0.229413 1.411287 -1.327523 1.260770 -0.011503 -0.181093 -0.528026 0.329484 0.576395 0.888354 -0.217917 -0.114165 -0.173878 -0.440239 -0.866534 -0.078510 -0.427051 0.345748 -0.420340 -0.289105 -0.167860 -0.379963 0.486769 0.656476 -0.480085 -0.065843 0.449144 -0.140238 -0.310777 2.905832 0.348498 -0.037712 1.301265 -1.334408
118 0.188387 1.111451 -0.246596 -0.115730 -0.203506 0.076417 -0.084117 -0.264162 0.262405 -0.418905 0.167527 0.401383 2.648425 0.014042 0.459076 -0.386003 -0.793093 0.171562 0.066810 -0.245236 -0.166308 -0.639688 -0.220319 0.066913 0.555532 -0.481333 -0.662336 -0.181093 0.006997 -0.109708 0.505739 -0.599531 0.188936 -0.114165 -1.195964 0.785466 -0.852339 -0.222912 -0.144425 0.017176 0.184277 -0.061264 -0.317039 0.254532 -0.447765 1.392932 1.699780 0.394939 -0.235365 1.710234 -0.396663 -0.500538 0.481854 -0.350614 -0.829375 0.749396
119 -0.169656 -0.180110 -0.246596 0.505774 0.085354 0.771859 -0.084117 -1.739376 -0.106195 -0.419213 1.078938 0.255587 -0.654085 -0.053220 -0.130952 -0.394793 0.014710 0.579323 -0.642365 -0.749153 -0.166308 -0.247970 -0.173468 -0.040637 0.788469 -0.447995 -0.659327 -0.181093 -1.111686 -0.215795 -0.880763 -1.269169 -0.164343 -0.114165 0.105838 -0.197043 0.163664 -0.445695 -0.069057 -1.414063 0.532533 -0.184182 -0.336703 -0.251207 -0.094317 -0.879348 -0.344153 -0.065843 -0.160290 1.137978 -0.249487 -0.600289 -0.213612 -0.311169 -0.802668 0.749396
120 -0.271953 -1.594110 -0.246596 -0.375276 -0.203506 -0.589220 -0.084117 -0.449128 -0.106195 0.155408 -0.136277 -0.596108 -1.063530 -0.165186 -0.130952 -0.437169 -0.304492 -0.671611 -0.432404 -0.245236 -0.166308 -0.572990 -0.614922 -1.191421 1.314977 -0.776335 -0.243484 -0.109551 -0.892814 -0.490933 -1.538721 -1.799149 -0.057583 -0.114165 2.289497 -0.698026 0.129189 -0.354320 -0.043935 -1.382062 0.367140 -0.304459 -0.232421 -0.082834 -0.596404 -0.283522 0.008318 -0.064275 -0.239782 -0.511045 -0.106232 -0.753095 -0.053435 -1.208961 -0.821763 0.749396
121 0.086089 0.577711 0.354619 -0.389227 0.266320 0.106222 0.130355 0.063663 -0.084468 -0.019330 1.990349 0.259184 -0.673238 0.071762 -0.130952 -0.358649 0.420114 -0.270687 0.175989 0.090709 -0.040005 0.528585 2.033689 0.260503 -1.327523 -0.465246 -0.692481 -0.181093 1.174318 0.172546 0.879782 1.563363 -0.217917 -0.114165 0.251251 0.243140 -0.566398 -0.262302 -0.427051 1.102205 -0.420340 0.326559 0.120494 -0.379963 -0.058798 -0.344585 -0.480085 -0.050038 -0.226533 2.150567 1.579927 1.225063 -0.644600 1.469588 1.301265 -1.334408
122 -0.408350 -0.538212 -0.246596 -0.313510 -0.203506 -1.016420 -0.032644 -0.802517 -0.106195 -0.370645 -1.452760 -0.700755 -0.280947 0.084680 -0.130952 -0.169072 -1.519450 -0.306959 -0.756608 -0.133254 -0.166308 -0.903304 -0.468615 -1.589355 0.700626 -0.112096 1.409841 -0.181093 -1.549432 -0.986462 -1.695783 -0.826026 -0.217917 -0.114165 -0.571356 -0.201907 -1.201146 -0.141857 -0.427051 -1.027363 0.531070 -0.236529 -0.235870 -0.379963 -0.688996 0.097659 -0.480085 -0.065843 -0.208868 -1.014666 -0.396663 0.538809 0.781721 -0.579730 1.301265 -1.334408
123 -0.442450 -0.623808 -0.246596 -0.424473 -0.203506 -0.933629 -0.084117 -0.155890 -0.106195 -0.639520 -0.946420 -0.320138 1.629841 -0.238343 -0.130952 -0.401090 -0.159779 -0.589107 -0.529480 -1.057103 0.602991 0.213093 -0.575669 -0.965566 -1.327523 -0.984116 -0.400157 -0.181093 -0.406430 0.277129 -0.408072 -0.273664 -0.217917 -0.114165 6.928878 -0.698026 -0.485280 -0.639398 -0.427051 1.102205 -0.420340 -0.300550 -0.248438 -0.379963 0.374861 0.665728 -0.480085 -0.055942 -0.239782 -0.893440 -0.396663 -0.431560 0.005048 0.744612 1.301265 -1.334408
124 0.154287 -0.571740 -0.123997 0.161440 -0.203506 -0.082541 -0.058380 0.582469 -0.077497 0.332577 -0.440081 -0.545869 -1.015648 -0.202246 0.350877 -0.351564 0.250868 0.383396 -0.052004 0.622622 -0.166308 -0.490412 0.189595 -0.395552 0.673729 -0.131672 2.759222 -0.181093 0.006997 -0.005697 -0.074476 -0.352445 -0.217917 -0.114165 0.666257 -0.698026 1.412881 -0.614294 -0.427051 1.102205 -0.091403 -0.304459 0.779752 -0.379963 -0.368082 -0.065176 -0.480085 -0.065843 -0.200036 -0.096561 -0.094883 0.342859 0.328382 -0.213116 1.301265 -1.334408
125 -0.016209 0.214700 0.373677 -0.117069 -0.203506 -0.254746 -0.084117 0.138852 -0.106195 0.370954 -0.136277 -0.278614 0.073828 1.487934 -0.130952 0.172625 -0.670676 -0.106869 0.714354 -0.245236 0.166672 0.421392 0.179350 -0.847261 0.554929 -0.336449 -0.402562 -0.181093 -0.090280 -0.303582 -0.290025 0.460430 -0.217723 -0.114165 -0.328672 -0.076054 0.548974 0.585937 -0.307720 -1.453547 -0.420340 -0.304459 -0.208321 -0.365726 -0.399649 -1.036631 -0.480085 -0.041811 -0.036637 0.700318 -0.396663 -0.365492 -1.019713 0.293094 -0.255831 0.749396
126 -0.561797 0.804705 -0.246596 -0.329448 -0.203506 -0.811099 -0.084117 1.335866 -0.106195 0.350844 -1.148956 -0.266854 -0.847527 -0.185307 -0.130952 -0.177206 0.940656 -0.197400 -0.228865 -0.413208 -0.166308 1.395924 0.062511 -0.244982 0.748353 0.180678 -0.273911 -0.181093 0.250189 -0.269336 0.407163 -0.297836 -0.150755 -0.114165 -0.542471 -0.485230 0.581421 -0.101785 -0.043935 1.102205 -0.004435 -0.304459 -0.216650 -0.012885 -0.055670 0.532500 -0.098861 -0.059407 0.095849 0.439149 -0.145923 -0.564022 0.604036 -0.304595 -0.806993 0.749396
127 1.253988 -0.077458 0.042570 0.122566 -0.203506 0.987114 -0.084117 -0.166416 -0.106195 0.352394 0.775134 -0.197288 -0.873186 -0.076269 -0.130952 -0.330638 0.697350 0.737790 0.346675 0.566631 -0.166308 0.131044 -0.475177 -0.406307 0.692655 -1.081029 1.702594 -0.181093 0.006997 -0.687221 -0.992582 1.565154 -0.208017 -0.114165 0.411476 -0.698026 -0.923317 0.284815 -0.069057 -1.519817 -0.420340 -0.283929 2.133504 -0.311871 1.493028 2.196001 -0.454863 -0.055100 0.833352 -0.021686 -0.396663 -0.035914 1.173968 -0.814302 -0.680493 0.749396
128 0.120188 0.240001 0.540019 -0.036470 -0.203506 0.559915 -0.084117 1.059169 -0.106195 0.439576 -0.035009 -0.163994 -0.574765 -0.130788 -0.130952 -0.158183 -0.248881 0.781643 0.884793 0.958567 0.216428 0.742443 1.226293 -0.987076 0.996144 -0.651221 -0.522710 0.054165 -0.844175 -0.344627 1.205003 1.569630 -0.018761 -0.114165 -0.665170 0.592734 0.411074 0.342837 0.496196 0.224862 1.485175 -0.096743 0.245525 0.702083 -0.495752 0.700886 0.302895 -0.061981 0.382901 1.114803 -0.091503 0.932986 1.871670 -0.593928 -0.811817 0.749396
129 -0.621471 -0.872164 -0.246596 -0.392294 -0.203506 -1.062782 -0.084117 -0.037090 -0.106195 0.103368 -0.743885 -0.346371 -0.334405 -0.087680 -0.130952 -0.339494 -0.506697 -0.357353 0.146101 -0.805144 -0.124207 -0.551287 -0.320581 0.249748 0.689441 -0.292450 -0.361201 -0.181093 -0.163238 -0.713208 0.355692 0.942068 -0.167254 -0.114165 0.768712 -0.086997 -1.223453 -0.516655 0.282656 -1.330943 -0.420340 -0.304459 -0.441725 0.164774 -0.574568 0.101359 -0.224435 -0.065843 -0.076383 -0.718733 -0.396663 -0.659905 1.006714 0.437305 -0.770510 0.749396
130 -0.331627 -0.010392 -0.136452 -0.376054 -0.203506 0.122780 -0.036933 -0.896504 -0.095060 -0.061242 0.268795 0.245780 -0.543929 0.070314 0.253099 0.207424 0.478269 -0.083010 -0.640142 0.342668 -0.122293 0.052436 0.591681 -0.223472 0.738915 0.889885 -0.772530 -0.181093 -0.625302 -0.270077 -0.435562 0.023555 -0.217723 -0.114165 -0.510623 0.197541 -1.134223 0.468418 -0.383087 -0.031942 0.287176 -0.177262 0.143066 -0.379963 -0.371328 0.128190 -0.468161 -0.049704 -0.204452 0.059873 -0.310427 1.687249 0.115682 -0.513010 -0.601852 0.749396
131 -0.280478 -1.074231 -0.246596 -0.359467 0.762978 -0.354094 -0.084117 1.726850 -0.106195 -0.740646 -0.541349 -0.453997 -0.953458 0.624966 -0.130952 -0.254939 -0.662226 -0.552835 0.062857 -0.245236 -0.097416 -0.730207 0.475533 -0.040637 0.738753 -0.208039 -0.338821 -0.181093 0.225869 0.772128 -0.565493 -1.023873 -0.217917 -0.114165 -0.512104 -0.504685 -0.452833 -0.162665 -0.427051 1.102205 0.778428 0.029168 3.371541 -0.379963 -0.237011 -0.348286 -0.480085 -0.065843 -0.239782 -0.306031 -0.396663 2.448650 -0.842400 1.365870 1.301265 -1.334408
132 -0.306053 0.860697 -0.246596 -0.375211 -0.203506 -0.365685 -0.084117 -0.000248 -0.106195 -0.152551 0.167527 -0.132179 0.081968 -0.018888 -0.130952 -0.229127 0.106487 -0.586431 1.457122 -0.119257 -0.166308 -0.130322 0.012668 0.443338 0.678989 -0.998750 -0.028886 -0.181093 -0.175397 -0.210818 0.360238 -0.543578 -0.217917 -0.114165 -0.526300 0.020009 0.306634 -0.057987 -0.427051 0.626342 -0.066775 -0.304459 -0.311717 -0.379963 -0.596464 0.735118 -0.480085 -0.047913 -0.076383 0.329065 -0.206703 -0.479838 -0.948937 -0.497449 1.301265 -1.334408
133 0.000841 2.067097 0.026964 -0.086833 -0.196370 1.235486 -0.084117 0.272689 -0.106195 0.116037 -0.946420 -0.116536 0.372007 0.013123 -0.130952 -0.418145 0.805565 -0.012027 1.379931 1.210525 -0.166308 0.931155 -0.344524 0.626172 -1.327523 1.306028 0.486848 -0.181093 -0.187557 -0.226130 1.643546 -1.425835 -0.140079 -0.114165 1.318022 -0.374576 3.440830 0.065916 -0.420771 1.102205 -0.420340 -0.276991 -0.099061 -0.135450 -0.032475 0.343760 8.687544 -0.065843 0.225685 0.448954 0.464843 -0.519461 -0.636778 -0.314596 -0.831821 0.749396
134 -0.161131 -0.484339 -0.076128 0.390233 -0.203506 0.735431 -0.084117 0.502769 -0.106195 -0.354265 -0.237545 -0.219065 -0.572658 -0.070351 -0.130952 0.352952 0.093812 0.928812 -0.467974 0.874580 -0.139517 0.004530 0.143435 -0.858016 0.895138 1.881011 2.508005 -0.181093 1.368872 -0.122585 -1.201975 1.139916 -0.113874 -0.114165 -0.293615 -0.035318 0.220447 0.370779 0.018871 -1.516162 -0.420340 0.016526 -0.117000 -0.028979 -0.261177 0.182777 -0.088374 -0.060689 0.528635 0.598703 -0.239333 2.307432 -0.024007 -0.806888 -0.813310 0.749396
135 0.572003 -1.116108 -0.246596 -0.263536 -0.020257 -0.880643 -0.084117 0.406526 -0.106195 -0.213143 -0.743885 -0.092487 -1.024605 -0.071851 -0.130952 -0.134568 1.148791 -0.523550 -0.466368 -0.385213 -0.166308 -0.982177 -0.536301 -1.804455 1.822881 -0.090581 0.530654 0.187766 -1.038729 -0.101830 -0.482595 -0.132217 -0.160461 -0.114165 -0.732569 -0.562444 -0.201367 -0.225535 -0.295159 -0.789047 0.348823 -0.166487 1.006306 -0.165163 -0.088700 -0.455608 0.144266 -0.045024 -0.239782 1.021209 -0.396663 -0.060380 -2.043356 -0.269836 -0.821960 0.749396
136 -0.024734 0.025026 -0.246596 -0.273859 1.818518 0.689068 -0.021614 -1.739376 -0.106195 -0.733344 0.673866 -0.420593 1.345086 -0.321829 -0.130952 -0.320340 -0.697398 0.415654 -0.601114 -0.273231 -0.166308 -1.122454 -0.512358 0.185218 -1.327523 -1.593504 -0.975750 -0.023700 -0.284834 1.558510 -0.463051 -0.057017 -0.217917 -0.114165 0.060165 -0.295537 -0.842199 -0.183023 -0.427051 0.382769 0.326350 -0.032212 -0.195902 -0.379963 -0.184209 -0.820135 -0.480085 -0.065843 -0.116128 -0.870265 -0.396663 0.018293 -0.885611 0.361703 1.301265 -1.334408
137 -0.604421 0.648214 -0.246596 -0.477492 -0.203506 -0.703471 -0.084117 0.278705 -0.059934 -0.394412 -1.452760 -0.161358 0.731328 -0.141981 -0.130952 0.025719 0.191873 -0.262809 -0.622604 -0.525190 -0.166308 -0.131248 0.187523 1.260717 -1.327523 -0.361743 0.017681 -0.181093 0.493381 0.034374 0.393597 0.700802 -0.153667 -0.114165 -0.764416 -0.698026 0.633134 -0.362762 -0.232353 1.102205 -0.420340 0.017512 -0.297967 -0.189924 -0.429821 0.208682 -0.078001 -0.065843 -0.147042 -0.896560 -0.396663 -0.712741 -0.090685 0.697159 -0.818761 0.749396
138 -0.570322 -0.518227 -0.167364 -0.408060 -0.203506 -0.665387 -0.084117 1.167442 -0.106195 -0.231448 0.370063 -0.292647 0.797940 0.020056 0.367261 -0.331819 0.512423 -0.641880 -0.547759 -0.525190 -0.166308 0.867104 0.076785 0.927312 -1.327523 0.097236 -0.306715 -0.181093 2.122767 -0.513912 -0.175628 -1.141150 -0.202194 -0.114165 0.143117 0.716763 2.877060 -0.313691 -0.383087 -1.371942 -0.420340 -0.304459 -0.367653 -0.224589 -0.428398 0.082855 0.214679 -0.054785 -0.133793 0.377645 -0.396663 -0.456099 1.212709 0.323866 -0.818964 0.749396
139 -0.634258 -0.678408 -0.182070 -0.154430 -0.203506 -0.171955 -0.084117 1.629856 -0.106195 -0.253163 -2.060368 -0.331809 -0.035764 -0.137697 -0.130952 0.140187 -0.029776 -0.083084 0.230579 -0.133254 -0.110811 0.695066 -0.088976 -0.067524 0.627731 0.193179 0.723450 -0.181093 0.505540 -0.196056 0.443995 0.257659 -0.182783 0.092728 -0.736025 -0.698026 -0.387938 -0.414805 0.018871 1.102205 0.540922 -0.184235 -0.279683 -0.083453 -0.101713 0.090257 -0.291851 -0.059494 0.003109 -0.685753 -0.396663 1.351600 -0.813345 0.642223 -0.776422 0.749396
140 -0.263429 -0.794793 0.274337 -0.284139 -0.203506 -0.251434 -0.079827 -0.107768 -0.086414 0.104991 -1.047688 -0.480981 -0.620118 -0.072953 0.455608 -0.174779 -0.177365 -0.243781 0.077925 -0.693162 0.147535 -0.020879 0.098657 -0.825751 0.692836 1.811331 -0.562204 5.175759 0.444742 -0.564107 -0.284012 0.344050 -0.207047 -0.114165 -0.752072 -0.431119 1.500082 -0.233947 -0.257475 -1.084866 0.150029 0.483354 -0.290032 -0.251207 -0.491916 2.443954 -0.350430 -0.052104 -0.155874 -0.284639 -0.305931 -0.084073 0.000206 -0.922426 -0.765251 0.749396
141 1.722853 0.259933 0.124878 -0.164883 -0.203506 1.318277 -0.032644 1.469703 -0.106195 0.365389 1.585278 0.309113 -0.935799 0.275164 0.211220 -0.077433 -0.557064 -1.250474 0.366930 2.414328 1.230678 -0.005792 2.405961 -0.158942 0.641696 0.831447 -0.372995 0.250157 -0.552345 -0.164245 0.086094 0.151573 -0.217917 1.618054 -1.400381 1.453037 1.135052 1.034707 -0.427051 1.102205 -0.420340 0.056284 0.770339 -0.379963 -0.148059 -2.200526 -0.480085 -0.045924 -0.005724 1.251181 -0.396663 2.431422 0.446094 -1.279598 1.301265 -1.334408
142 -0.016209 0.811480 -0.246596 0.089091 -0.203506 0.039989 -0.084117 0.620064 -0.008059 1.046616 0.775134 0.717729 -0.080886 0.088652 -0.130952 0.283812 0.766014 0.014285 0.526748 -0.105259 0.323594 1.420274 0.405775 0.023893 0.808639 -0.142914 -0.692669 -0.181093 0.493381 -0.417462 -0.271627 0.774659 -0.217917 -0.114165 -0.228192 0.454720 -0.986183 0.250790 -0.427051 1.102205 -0.318288 0.104106 -0.281408 -0.379963 -0.290173 0.743445 -0.480085 -0.065843 -0.239782 0.424887 -0.011866 1.260282 0.838342 1.048072 1.301265 -1.334408
143 -0.519173 -0.834328 -0.246596 -0.305951 -0.203506 -0.622336 -0.084117 0.092235 -0.106195 -0.296663 -0.743885 -0.523475 -0.513482 -0.230571 -0.130952 -0.362912 0.054261 -0.371624 -0.161430 -0.329222 -0.166308 0.899924 -0.318739 -0.438572 -1.327523 0.897541 0.830675 -0.181093 0.955445 -0.458592 -0.228818 -0.115207 -0.217917 -0.114165 -0.310897 -0.698026 0.185971 0.043836 -0.427051 -0.779644 -0.420340 0.001137 -0.202358 -0.379963 -0.263770 0.077304 -0.480085 -0.065843 0.038438 -0.405864 -0.396663 0.509852 0.574236 -0.354880 1.301265 -1.334408
144 -0.246379 -0.417310 -0.246596 -0.329707 0.272029 -0.046113 -0.084117 -0.348374 -0.036939 -0.061728 0.977670 -0.137231 -0.478360 -0.202999 -0.130952 -0.205282 0.524589 -0.332528 0.308388 0.006723 -0.074451 -0.711944 0.285253 0.282013 0.560910 0.336998 -0.503634 -0.181093 0.517700 0.602398 0.408953 0.092488 0.212229 -0.114165 -0.360520 0.523423 -0.327100 0.089570 0.182166 -0.801954 0.290485 0.598806 -0.254154 0.287959 -0.155376 -0.105884 0.616535 -0.046979 -0.239782 -0.061798 1.137745 -0.405109 -0.595430 1.304255 -0.826169 0.749396
145 0.895946 2.704413 3.455009 0.450833 2.733909 0.258557 0.059886 -0.131829 -0.046625 -0.499480 1.787814 0.241974 2.648425 -0.172932 0.298627 0.265248 0.216228 0.739276 -0.679170 0.538635 -0.166308 -1.152892 0.278692 0.034648 -1.327523 -0.976945 -0.740035 0.728991 0.201550 3.539015 -0.807457 -1.031931 0.291813 -0.114165 -0.406193 0.551998 -0.840171 -0.192998 -0.420771 1.102205 0.299105 0.143335 -0.318567 -0.379963 2.108649 -1.118048 2.132895 -0.061028 0.321074 1.319816 0.323166 0.701169 0.371966 1.632823 -0.832423 0.749396
146 -0.638520 -0.995702 -0.246596 -0.451296 -0.203506 -0.817722 0.001672 -0.244613 -0.106195 -0.780139 -3.174315 -0.619804 -0.917322 -0.321829 -0.130952 -0.437169 -0.344729 -0.734046 -0.048793 -0.693162 -0.166308 -0.320755 -1.506004 -0.126677 -1.327523 0.130186 -0.403502 -0.181093 0.882488 -0.249173 0.078363 -1.619206 -0.217917 -0.114165 -0.350891 -0.698026 -1.188978 -0.541786 -0.427051 -0.940091 -0.420340 -0.202793 -0.315265 -0.379963 -0.556033 0.166123 -0.480085 -0.065843 0.162092 -0.765975 -0.396663 0.030629 -0.633053 -0.444471 1.301265 -1.334408
147 -0.578847 -0.759025 -0.246596 -0.437215 -0.203506 -0.582597 0.175701 -0.343863 -0.022725 0.301182 -1.554028 -0.241989 -0.779902 0.545509 0.362169 -0.195377 -0.088263 -0.456952 -0.007294 -0.133254 0.097780 -0.607927 1.579917 0.260503 0.698649 -0.345074 2.121782 -0.181093 1.174318 0.244937 0.517336 0.193650 -0.217917 -0.114165 2.071008 0.846265 -1.225481 -0.215769 -0.427051 -0.295702 1.420141 -0.304459 -0.379629 -0.379963 -0.236278 0.047698 -0.480085 -0.055409 -0.239782 -0.832828 -0.396663 -0.681875 0.655814 -0.230881 1.301265 -1.334408
148 -0.340152 -0.473939 -0.246596 -0.218831 0.687908 0.573161 -0.084117 -0.404014 -0.082974 -0.356813 -0.035009 -0.211122 0.368920 -0.060060 -0.130952 3.072358 0.960787 0.993923 -0.251343 -0.077263 -0.147171 1.143424 -0.967856 0.636927 0.394260 -0.124597 -0.378530 0.197415 0.250189 2.286713 1.002841 1.515020 -0.126492 -0.114165 -0.592834 1.669481 -0.241926 0.033600 -0.169547 -0.176463 0.317653 2.084721 -0.323693 0.120205 -0.034248 0.673130 0.416397 -0.049346 0.056103 -0.873830 7.649415 -0.174699 -0.626720 1.689053 -0.819785 0.749396
149 -0.374251 0.254272 -0.246596 -0.421363 -0.203506 0.069794 -0.074312 -0.852142 -0.106195 -0.148932 0.471331 -0.439744 0.639564 -0.185739 -0.130952 0.197421 1.429364 -0.181494 -0.179709 -0.665167 -0.166308 -0.561344 -0.609973 0.970332 0.768685 -0.835452 1.804607 -0.181093 0.614977 -0.388659 1.583127 1.128278 -0.130374 -0.114165 -0.746147 0.044936 -0.505559 -0.545622 -0.194669 1.102205 -0.420340 -0.304459 -0.405207 -0.240065 -0.229564 -0.710962 -0.179049 -0.050231 -0.049886 -0.221352 -0.396663 -0.445319 -0.635288 -0.827101 -0.820492 0.749396
150 -0.254904 0.533855 -0.246596 -0.391214 -0.203506 0.864584 -0.084117 -0.231079 -0.106195 -0.845491 0.876402 -0.447422 -0.674111 -0.159674 -0.130952 -0.234932 -1.227326 -0.151912 -0.797736 -0.133254 0.013578 -0.414979 -0.128574 -0.675181 0.644494 0.544391 -1.141263 -0.181093 -0.163238 -0.732269 0.414322 -0.617436 -0.217917 -0.114165 0.115960 -0.240818 -0.771221 -0.467796 -0.427051 1.102205 -0.420340 0.411603 0.233155 -0.379963 -0.196509 -0.061475 -0.480085 -0.052544 -0.085215 -0.642076 4.723567 -0.326253 -0.422587 0.035862 1.301265 -1.334408
151 -0.003422 0.715073 -0.246596 0.173987 -0.203506 -0.304420 -0.084117 -0.077693 -0.106195 0.033742 0.268795 -0.452210 -0.691433 -0.189945 -0.130952 0.530131 -1.047192 0.519267 0.125969 -0.077263 -0.166308 0.949947 0.169220 -0.664426 -0.295970 0.606221 -0.296290 -0.181093 0.347465 -0.986462 0.558857 1.438477 -0.124550 -0.114165 -0.321019 -0.088213 0.597645 0.021249 0.094238 -1.465911 -0.420340 -0.148034 -0.396139 -0.223970 -0.131816 -0.207656 -0.334424 -0.065843 -0.169123 0.318815 0.003724 0.532787 3.103544 -1.044467 -0.807562 0.749396
152 -0.340152 -0.195308 -0.246596 -0.353377 -0.140710 -0.115657 -0.071861 0.233591 -0.106195 -0.588122 -0.642617 -0.638514 -0.521538 -0.219966 -0.130952 -0.171827 0.353663 -0.365084 0.401512 -0.301227 0.331249 0.277939 -0.299170 0.669192 -1.327523 0.606028 -0.744199 0.240840 2.147086 0.868790 -0.172908 0.377173 -0.217917 -0.114165 -0.780217 -0.175156 -0.763109 -0.239837 -0.427051 -1.186040 -0.208078 0.010328 -0.255189 -0.379963 -0.194528 0.262343 -0.480085 -0.065843 -0.239782 -1.298120 0.086262 -0.233872 -0.101861 0.132376 1.301265 -1.334408
153 0.444131 0.792148 -0.246596 -0.378991 0.126172 0.321477 -0.075538 0.253140 -0.106195 -0.037282 -1.655296 -0.090611 2.648425 -0.062818 -0.130952 0.948314 0.537063 0.025434 0.443257 -0.357217 0.162845 -0.847987 -0.006786 1.755446 -0.350617 0.576857 -0.909725 -0.181093 1.855256 1.167712 1.188753 2.107668 1.316709 0.366557 -1.206333 0.312451 -0.817863 0.092371 0.182166 -0.911038 -0.100022 -0.168670 0.737024 0.068826 0.050201 0.260493 0.945806 -0.048170 -0.155874 -1.141239 0.206260 -0.400807 -1.194045 1.763886 -0.830747 0.749396
154 -0.161131 -1.170206 -0.246596 0.318187 0.946513 -0.466690 -0.009971 0.627583 -0.043457 -0.054850 0.167527 -0.284527 -0.667351 0.021290 -0.130952 0.022472 -0.681161 -1.250474 0.976559 -0.189245 -0.166308 -2.397656 0.200876 0.830517 0.775070 -0.038054 -0.572628 -0.039672 0.639296 -0.348651 0.614766 1.579477 -0.217917 -0.114165 -1.400381 0.144646 -0.945624 -0.135708 -0.427051 1.102205 0.699310 -0.304459 -0.313442 -0.379963 -0.121375 -2.200526 -0.480085 -0.048325 -0.239782 1.280596 -0.396663 4.880725 0.148463 -0.867874 1.301265 -1.334408
155 -0.502123 -0.516198 -0.246596 -0.431427 -0.032816 -0.522987 -0.084117 -1.739376 -0.073151 -0.068389 -1.351492 -0.055177 -0.568039 -0.113823 -0.130952 -0.167170 -0.478625 -0.577958 -0.411902 -0.749153 -0.166308 0.319757 -0.191195 0.131443 -1.327523 -1.208760 0.514386 -0.181093 0.396104 0.200249 -0.451454 0.707516 -0.217917 -0.114165 0.134970 -0.698026 -0.175004 -0.458506 -0.427051 -1.142959 -0.420340 -0.304459 -0.298115 -0.379963 -0.062646 0.515847 -0.480085 -0.055990 -0.239782 -0.761518 -0.396663 1.207521 0.725100 1.384264 1.301265 -1.334408
156 0.716925 0.171644 -0.246596 2.418207 -0.203506 2.450853 -0.084117 0.735103 -0.106195 0.624755 0.471331 0.332346 1.394531 -0.321829 -0.130952 0.594252 0.201341 1.999427 -0.048298 -0.175247 -0.105070 0.617781 -0.087134 -0.723579 0.624821 -0.795912 -0.646653 -0.181093 -0.941452 0.209313 0.374519 1.835515 0.022972 -0.114165 -0.657023 0.064392 -0.234828 -0.246761 0.006310 1.102205 -0.420340 0.039028 -0.371152 0.305911 -0.225472 0.563957 0.700430 -0.049263 0.409398 -0.247647 -0.396663 -0.598931 0.005235 0.130488 -0.824427 0.749396
157 -0.237854 -0.226879 0.164193 -0.230623 -0.178959 0.818221 0.000446 0.200508 -0.073513 0.733761 -0.642617 2.540174 0.218115 -0.075082 -0.130952 -0.437169 -0.204087 0.034799 -0.708564 -0.497194 0.262356 0.089755 2.474798 1.637142 1.288216 0.945804 3.567669 0.016896 0.420423 -0.534244 -0.284799 0.310031 0.268520 -0.114165 0.456655 -0.002487 1.672458 -0.025800 0.295217 0.286994 0.351594 -0.223007 -0.099948 0.275579 -0.370880 -1.173560 0.485605 -0.047870 -0.239782 0.940987 -0.396663 -0.735080 -0.280663 -0.473495 -0.825870 0.749396
158 -0.050308 -0.588133 -0.246596 0.108096 -0.099608 0.189012 -0.084117 -0.904775 -0.031461 -0.145946 1.585278 -0.247593 0.340304 -0.015304 -0.130952 -0.263926 0.011988 0.326609 -0.119685 0.174695 -0.166308 0.123369 0.784263 -0.384797 0.678324 0.292612 -0.657393 -0.181093 0.104274 -0.019210 -0.006468 -0.182350 1.008076 -0.114165 -0.538767 0.118503 -0.353463 -0.252627 2.455742 -0.052679 -0.420340 -0.304459 -0.300973 1.089589 0.497615 -0.305727 0.362435 -0.059784 -0.120545 -0.222243 -0.396663 0.044841 -0.823030 1.139131 -0.822039 0.749396
159 -0.629996 -1.594110 -0.246596 -0.397823 -0.203506 -0.254746 0.011476 -1.739376 -0.094109 -0.459317 1.078938 0.077027 -0.223770 0.081168 -0.130952 -0.007374 -0.431204 -0.769723 0.257380 -0.413208 -0.166308 0.509529 -0.258421 -0.417062 0.617397 -0.180419 -0.641985 -0.181093 -0.795537 -0.986462 3.704185 -0.738293 -0.217723 -0.114165 -0.345707 -0.256018 -0.160808 -0.820254 -0.357965 -0.113733 -0.108411 -0.047918 -0.278796 -0.026503 -0.645372 2.099781 -0.259477 -0.041898 -0.239782 -0.615335 -0.396663 -0.732808 -0.075413 -0.811015 -0.486512 0.749396
160 -0.122769 0.602661 0.868725 0.336112 1.336132 -0.181890 -0.039997 0.352390 -0.028881 -0.305505 -0.743885 -0.002798 0.476203 0.125012 -0.130952 0.794291 -0.701114 -1.250474 0.524772 0.132702 1.605759 0.072022 0.329111 -0.562254 0.708374 0.076303 -0.611652 -0.181093 -0.430749 0.016541 -0.140157 -0.429436 -0.217917 0.323961 -0.136599 0.285699 0.664567 0.206840 -0.427051 1.102205 0.165498 -0.304459 -0.243953 -0.379963 0.277175 -2.200526 -0.480085 -0.046466 -0.041053 -0.484304 -0.286516 0.501683 -0.855438 1.794029 1.301265 -1.334408
161 -0.016209 -1.594110 0.220166 0.636692 -0.203506 0.298296 -0.072474 -1.739376 -0.106195 0.323852 -0.946420 0.275003 0.106298 0.329305 -0.130952 9.944867 -0.688286 1.668074 0.112754 1.042553 -0.166308 1.857516 -1.129703 -1.320481 0.980730 -0.986539 -0.218350 -0.181093 -1.768305 -0.898166 0.419619 -0.339912 -0.131344 -0.114165 1.249636 -0.698026 0.605757 -0.389018 -0.081619 -0.716613 -0.420340 -0.248133 -0.368589 0.010638 -0.328704 -0.970017 0.032298 -0.065843 -0.036637 -0.777563 -0.396663 -0.728944 -1.097939 -1.148325 -0.814591 0.749396
162 -0.323102 -0.580568 -0.246596 -0.364305 -0.203506 -0.344159 -0.084117 -0.061151 -0.076501 -0.074155 -1.047688 -0.448349 -0.732780 -0.167813 -0.130952 -0.364946 0.267792 -0.425735 -0.627791 -0.357217 -0.166308 0.698242 -0.352352 -1.471051 -1.327523 -0.397794 0.020522 -0.181093 -0.455068 -0.986462 -1.017924 -0.204731 -0.217917 -0.114165 -1.342117 -0.698026 -1.324851 -0.580198 -0.427051 -1.299300 -0.420340 -0.304459 -0.409888 -0.379963 -0.146946 1.592774 -0.480085 -0.052593 -0.239782 -1.259791 -0.396663 2.122490 0.028516 -0.994881 1.301265 -1.334408
163 0.520854 -0.505625 0.426573 -0.288458 -0.203506 0.056547 -0.084117 -0.044609 -0.106195 0.425174 0.673866 -0.086971 0.149312 -0.189979 0.023452 -0.394202 -0.197981 -0.450858 -0.294570 -0.105259 0.136053 -0.170685 0.954744 -0.642916 -1.327523 2.270310 0.656350 0.314711 -0.090280 -0.775560 -0.648892 0.423726 -0.104169 -0.114165 -0.988337 0.164102 -0.359547 0.386816 0.571563 1.102205 -0.391402 0.007881 -0.208173 0.205010 -0.211642 1.426239 -0.186998 -0.056440 0.245999 1.443716 -0.396663 0.225609 -1.183242 -0.852348 -0.793435 0.749396
164 -0.271953 0.865741 -0.246596 0.688092 -0.203506 1.463989 -0.084117 -1.333354 -0.038478 -0.626683 -0.845152 -0.369405 0.481110 -0.039049 -0.130952 -0.225289 0.428232 0.492063 0.309623 -0.497194 -0.166308 -0.358074 -0.336927 -0.890281 -1.327523 -0.700937 0.480145 -0.181093 -1.208963 -0.521622 -0.521896 -0.877054 0.067617 -0.114165 -0.971549 -0.698026 1.195890 -0.358253 0.182166 0.054249 -0.216390 -0.299881 -0.009662 0.149918 -0.730038 -1.105095 0.231440 -0.057752 -0.005724 -0.733886 -0.396663 -0.410260 -1.036103 -0.939980 -0.822954 0.749396
165 -0.169656 -0.809758 -0.246596 1.474419 -0.203506 -0.211694 -0.084117 0.592996 -0.106195 -0.157304 0.370063 -0.124501 0.401564 0.394523 -0.130952 0.208769 0.411333 0.557322 -0.724126 0.398658 -0.166308 0.733708 -0.938502 0.142198 0.659080 -0.174507 -0.657500 -0.181093 -0.163238 -0.031875 1.096620 -0.578940 -0.164343 -0.114165 -0.262755 0.564158 0.534779 0.275282 -0.244914 -1.034175 -0.420340 0.133545 -0.320637 -0.150926 -0.292419 -0.096632 -0.016320 -0.050754 0.382901 2.259314 -0.396663 0.173197 -0.479581 0.389608 -0.817086 0.749396
166 -0.570322 0.596824 -0.246596 -0.368927 -0.203506 -0.311043 -0.084117 -0.294238 -0.106195 0.086726 -2.566707 -0.270539 -0.393581 -0.081623 -0.130952 -0.317519 0.624686 -0.474791 -0.865418 -1.924960 -0.166308 0.249354 -0.167022 -0.309512 0.454782 0.874476 -0.213272 -0.181093 1.636383 -0.604474 0.021809 -0.082084 -0.176960 -0.114165 -0.230167 -0.698026 0.033875 -0.446198 -0.106741 1.102205 -0.420340 -0.304459 -0.313787 -0.080358 -0.197489 0.079155 -0.187975 -0.065843 0.245999 -0.946031 -0.396663 -0.769975 -0.221807 -0.712472 -0.797663 0.749396
167 0.657251 -0.620144 -0.207243 0.678848 0.860026 0.172454 0.341150 0.754653 0.025030 -0.180545 1.484010 0.260838 1.240761 0.080371 0.751651 0.209064 -1.172484 -0.292391 0.792163 0.286677 -0.166308 0.081285 1.808184 -0.449327 0.662382 0.347658 -0.248448 0.304729 -0.236195 0.437413 0.387334 0.633211 -0.217917 -0.114165 0.381850 2.240382 -0.768179 0.192624 -0.427051 1.102205 2.737278 -0.036314 -0.009366 -0.379963 -0.447288 0.149469 -0.480085 -0.026354 0.047271 0.196252 -0.396663 0.943211 -1.229060 -0.070024 1.301265 -1.334408
168 -0.161131 0.224800 -0.246596 -0.041999 0.081929 0.665887 -0.084117 0.199004 -0.015256 -0.054350 0.775134 -0.402457 -0.721880 -0.162184 0.022696 -0.309123 -0.071564 -0.033731 0.587390 -0.469199 -0.105070 -0.229972 -0.069292 0.131443 -0.379473 -0.789031 -0.407196 -0.181093 0.371785 1.372832 -0.595631 -0.117893 2.286283 -0.114165 -0.709856 -0.122261 -0.237870 -0.159003 6.173854 1.102205 -0.420340 -0.162278 -0.371990 2.007595 5.255351 1.013602 0.368537 -0.058284 0.144427 0.081712 0.602710 -0.678913 -0.508636 2.399201 -0.820780 0.749396
169 -0.502123 -1.187192 -0.246596 -0.449093 -0.203506 -0.542857 0.079495 0.165921 -0.036350 -0.045685 0.066259 -0.224404 0.225500 -0.206650 0.191239 -0.043125 0.382930 0.179589 -0.335698 -0.693162 -0.166308 0.123104 1.227675 -0.589141 0.728345 -0.163362 -0.779704 -0.129849 0.055635 0.100728 1.293843 -0.766940 0.527656 -0.114165 0.677860 0.812825 -0.854367 -0.002877 1.011205 1.102205 0.903492 -0.036085 -0.354839 0.317672 -0.465265 0.084706 0.154036 -0.052883 0.131178 -0.446867 -0.396663 -0.384961 0.486697 0.578790 -0.823909 0.749396
170 -0.433925 -0.348214 0.328209 -0.323229 0.638243 0.109533 -0.084117 0.665177 -0.106195 -0.043449 0.673866 0.265185 -0.885819 0.208209 -0.130952 -0.303219 0.232442 -0.088436 -0.010258 -0.217240 -0.166308 -0.260277 -0.252550 -0.288002 0.697039 0.769907 3.283097 -0.181093 0.614977 1.052921 0.387978 -0.948673 -0.185695 -0.114165 -0.294726 -0.698026 3.528032 -0.528242 -0.295159 -1.218654 -0.420340 -0.264349 -0.280619 -0.226446 0.099374 -0.153069 -0.134575 -0.056043 -0.239782 -0.269486 -0.396663 -0.348166 0.466954 2.814842 -0.813875 0.749396
171 -0.229329 -1.067907 -0.246596 -0.225742 -0.203506 0.901012 -0.084117 -0.754396 -0.106195 -0.277550 -0.642617 -0.564248 -0.185786 -0.084136 -0.130952 -0.437169 -0.342208 0.616190 -0.666078 -0.581181 -0.166308 -0.584106 -0.146992 0.088423 0.677572 -0.006461 -0.910464 -0.041669 1.587745 -0.386880 0.037916 0.084431 -0.146873 -0.114165 0.658851 0.188421 -0.714438 -0.228931 -0.144425 -0.649300 -0.068622 -0.273646 -0.239665 0.234723 -0.315501 -0.709112 0.397754 -0.060994 0.153259 0.336642 -0.396663 -0.675188 -0.636405 -0.392647 -0.814609 0.749396
172 -0.642783 -0.433503 -0.246596 -0.415424 -0.203506 -0.807787 -0.084117 -0.001000 -0.106195 0.650699 -0.743885 -0.540662 -0.548915 -0.321829 -0.130952 -0.437169 0.888099 -0.176886 -0.498356 0.202691 -0.166308 0.021204 -0.449737 0.002383 0.655151 0.572302 -0.383272 -0.181093 0.225869 -0.616970 0.029934 -0.248150 -0.217917 -0.114165 0.207306 -0.698026 -0.022907 -0.582784 -0.427051 1.102205 -0.420340 -0.301008 -0.501998 -0.379963 -0.339934 0.280847 -0.480085 -0.065843 -0.160290 -0.762410 -0.396663 0.598484 0.955681 -0.495875 1.301265 -1.334408
173 0.000841 -0.334328 -0.246596 -0.271224 -0.203506 -0.450131 -0.084117 0.111784 -0.074781 -0.296301 1.281474 -0.145417 -0.331617 0.242142 0.175006 0.106043 0.663752 0.312487 0.993479 0.118704 -0.166308 1.008970 -0.147913 -0.266492 1.059464 0.669215 0.630625 -0.181093 0.760892 0.187562 0.499511 1.185573 -0.217917 -0.114165 -1.037713 0.436481 -0.341295 -0.194309 -0.427051 0.418714 0.335893 -0.031190 -0.100786 -0.379963 0.062339 0.462185 -0.480085 -0.040978 -0.239782 -0.927312 -0.396663 0.243153 1.579626 -0.246477 1.301265 -1.334408
174 -0.689669 -0.627166 -0.246596 -0.272952 -0.203506 -0.284550 -0.084117 -0.313787 -0.055588 -0.256787 -3.275583 -0.570558 -0.668365 -0.321829 -0.130952 -0.242476 -1.166733 -0.547929 -0.387941 -1.029107 -0.166308 -0.260674 -0.355575 -0.481591 -1.327523 0.168176 -0.238272 -0.181093 0.298827 -0.284373 -1.126737 0.355688 -0.217917 -0.114165 -0.818977 -0.698026 -0.789472 -0.272930 -0.427051 1.102205 -0.420340 -0.304459 -0.430587 -0.379963 1.437185 1.570569 -0.480085 -0.057364 -0.010140 -0.906811 -0.396663 0.382874 0.579823 -0.007779 1.301265 -1.334408
175 0.222486 1.152517 0.093365 -0.077547 -0.203506 2.550202 -0.084117 0.200508 -0.106195 0.334142 -0.642617 -0.104070 -0.068546 -0.080717 -0.130952 -0.374458 0.803885 -1.073129 1.468362 -0.497194 0.070988 -0.626190 1.511195 0.464848 -1.327523 -0.472126 -0.697451 -0.181093 1.028403 -0.602378 0.251747 -0.988064 -0.003232 -0.114165 0.555408 -0.137460 1.078269 -0.391343 -0.332842 0.364207 -0.420340 -0.172139 -0.079496 -0.356440 -0.221315 0.291950 -0.120975 -0.065843 -0.239782 0.901767 0.105725 -0.751601 0.065394 -1.259386 -0.830455 0.749396
176 -0.391301 0.460438 0.021561 0.099328 -0.203506 0.573161 -0.020388 -0.497249 -0.106195 -0.548513 -0.743885 -0.507104 1.221123 -0.098223 -0.130952 -0.437169 0.669989 0.359165 -0.865418 -0.105259 -0.116552 0.224739 0.035115 4.132299 -1.327523 -0.086705 -0.366601 -0.181093 -0.552345 -0.065380 0.229627 -1.717683 -0.171137 -0.114165 0.117194 0.441345 -1.010519 -0.167941 -0.257475 0.483501 0.717935 0.075441 0.061207 -0.004218 0.189969 -0.518521 0.256499 -0.065843 -0.239782 0.623661 -0.396663 -0.398059 0.284427 -0.453143 -0.816056 0.749396
177 2.285491 -0.768810 -0.246596 -0.214728 -0.203506 0.089664 -0.084117 0.123815 -0.106195 -0.364903 1.180206 -0.044278 -0.823896 -0.187004 -0.130952 -0.415981 0.364326 0.494144 -0.416965 0.874580 -0.166308 -0.426360 0.044209 -0.739711 -1.327523 -1.367406 -0.678389 -0.181093 -0.868494 -0.523358 -1.055937 0.233935 -0.095434 -0.114165 -1.081164 0.246180 -0.377798 0.159628 0.835348 -1.377829 -0.420340 -0.294528 0.032031 -0.103261 -0.580835 0.719390 -0.369656 -0.054654 -0.164706 -1.151044 -0.396663 -0.322369 -1.394080 -0.053029 -0.786669 0.749396
178 -0.016209 2.020837 -0.096836 -0.063812 0.410180 -0.056048 -0.084117 0.552394 -0.053370 0.063495 1.180206 0.278136 0.231387 -0.145411 0.017942 -0.290952 0.923248 -0.038042 0.124981 0.678613 -0.147171 0.506617 -0.088976 1.120902 -1.327523 1.707247 -0.695799 -0.181093 0.298827 0.912398 0.940130 0.803306 -0.064377 -0.114165 -0.586169 1.017109 -0.343323 0.071924 0.345462 0.002060 0.539767 4.418754 -0.187130 0.884075 -0.435950 0.680532 0.478872 -0.047386 0.029606 -0.597508 3.238302 -0.158402 -0.870338 1.613520 -0.810045 0.749396
179 3.120922 -0.111921 -0.069975 -0.349144 -0.203506 1.652751 -0.084117 0.594499 -0.106195 0.255043 0.370063 -0.126222 0.119344 -0.103258 -0.130952 -0.365274 0.110333 -0.616460 1.120320 0.034718 -0.112725 0.334844 -0.204779 0.905802 -1.327523 1.342758 -0.641340 -0.181093 0.250189 -0.288566 -0.685401 -0.296045 -0.217917 -0.114165 0.229031 -0.698026 -0.223674 -0.389999 -0.427051 1.102205 -0.379242 0.036000 -0.329163 -0.379963 -0.414523 0.299351 -0.480085 -0.061391 -0.217701 -0.429931 0.094296 -0.012322 -0.244157 -0.818358 1.301265 -1.334408
180 -0.101457 -0.673175 -0.045365 -0.098452 -0.203506 0.261868 -0.084117 2.508819 -0.106195 0.245611 -0.136277 0.191780 0.014286 0.226246 -0.130952 -0.373736 0.006082 -0.502590 -0.088809 -0.189245 0.055679 -0.103457 0.260734 1.067127 0.723022 0.949680 -0.323251 -0.181093 0.469061 -0.374003 -0.607085 0.644849 -0.217917 -0.114165 1.705871 -0.698026 0.926173 0.148433 -0.427051 1.102205 -0.420340 -0.294775 -0.303388 -0.379963 -0.144393 0.371516 -0.480085 -0.029712 -0.089631 -0.504806 -0.396663 -0.293554 1.022359 -0.789614 1.301265 -1.334408
181 0.785123 -1.092828 1.694805 1.105681 -0.203506 0.341347 0.215531 -0.124310 -0.034675 0.563321 -2.262903 0.338524 -0.927208 0.187680 0.386099 -0.043912 -0.989877 2.024698 -0.092143 1.714442 0.116916 0.628897 1.229862 -1.379633 -1.327523 -0.997006 -0.457303 -0.076941 -1.330559 -0.222339 -0.873962 -0.904807 -0.168225 -0.114165 1.125455 0.443776 1.019459 -0.190675 0.043993 -1.026438 1.684508 -0.116851 -0.209307 0.049017 -0.447511 0.155021 -0.173490 -0.051102 0.210670 1.202156 -0.396663 -0.496738 -1.352359 -1.070834 -0.790389 0.749396
182 -0.254904 0.109607 -0.127748 -0.319881 -0.203506 -0.004718 0.028634 1.635119 -0.072155 0.146671 -1.756564 -0.262375 -0.587192 -0.148918 -0.130952 0.259673 -0.018320 -0.294621 -0.021868 -0.007275 -0.166308 0.328756 -0.014153 -1.067738 0.721555 0.472870 0.845680 -0.181093 -0.904973 -0.436142 0.034373 0.414773 -0.217723 0.242827 -0.701832 -0.150836 0.763937 -0.434736 -0.282598 1.102205 0.737946 -0.175361 -0.252183 -0.261730 -0.283782 -0.515746 -0.450770 -0.058197 -0.124961 -0.316728 -0.396663 0.016945 -0.506587 -0.702122 -0.140490 0.749396
183 -0.212280 -0.933759 -0.246596 -0.392553 -0.203506 -0.506429 -0.084117 0.923829 -0.106195 -0.258378 1.180206 -0.230339 -0.697601 -0.134567 -0.130952 -0.279341 0.028225 -0.398680 -0.080657 0.202691 -0.166308 0.195096 -0.158964 -1.600110 0.893572 -0.426093 -0.703604 -0.181093 -0.771218 -0.554322 0.415109 -0.065074 -0.193459 -0.114165 -0.263496 0.632253 -1.079469 0.355487 -0.420771 -0.799086 -0.420340 0.039592 -0.244643 -0.264206 -0.449028 -0.270569 2.274463 -0.044366 0.184172 1.053298 -0.396663 0.053372 1.019006 -1.010897 -0.830315 0.749396
184 0.034940 0.025480 0.878479 -0.351865 -0.203506 -0.609090 -0.084117 1.032101 -0.106195 -0.002239 0.572598 0.267788 0.256849 -0.063505 -0.130952 -0.112921 0.375474 -0.409234 -0.031749 0.370663 -0.166308 1.117486 0.341773 -1.653885 0.908892 0.032886 -0.260491 -0.181093 -1.014410 -0.955498 -1.189948 0.606354 -0.128433 -0.114165 0.783772 -0.698026 0.336040 0.442759 0.295217 -1.409321 -0.420340 0.006560 -0.288702 0.291673 -0.377191 1.740805 -0.064057 -0.053938 0.003109 -0.431714 -0.396663 -0.105462 -0.593195 -0.870532 -0.796567 0.749396
185 -0.561797 -1.594110 -0.246596 -0.307334 -0.037954 -1.821145 -0.084117 -0.413037 0.010410 0.109202 -1.047688 0.118396 -0.582263 -0.321829 -0.130952 0.474242 -0.594935 -1.250474 0.195009 -0.567183 -0.166308 -2.397656 -0.423261 2.029699 0.719938 0.258692 -0.935880 -0.181093 1.891735 -0.028825 1.599055 -0.180560 -0.217917 -0.114165 -0.293121 -0.294321 -0.471084 -0.210253 -0.427051 1.102205 -0.420340 -0.304459 -0.494409 -0.379963 0.340480 -2.200526 -0.480085 -0.052506 -0.239782 0.042492 -0.396663 1.095140 -1.447348 0.740136 1.301265 -1.334408
186 0.375932 -0.423957 -0.104864 -0.211402 0.458133 1.040100 0.076431 -0.246117 -0.106195 -0.152429 0.268795 0.200517 -0.594459 0.019565 -0.130952 0.079017 -0.467631 -0.098247 0.030375 -0.553185 -0.040005 -0.088371 -0.135711 0.593908 0.676185 -0.534441 0.678931 -0.181093 0.055635 1.191454 0.662443 0.244678 -0.217917 -0.114165 -0.852306 0.808569 0.636176 -0.798355 -0.427051 1.102205 0.167115 15.681973 -0.357057 -0.379963 0.064651 1.407735 -0.480085 -0.049481 -0.239782 0.743995 9.886660 0.003986 -0.621877 3.064939 1.301265 -1.334408
187 0.554953 1.730056 0.998752 1.700448 -0.203506 0.453943 -0.084117 -0.402510 -0.106195 -0.383428 -0.338813 0.510421 -0.711262 -0.083570 -0.130952 -0.437169 -0.853614 0.973483 0.203408 0.818590 -0.166308 -0.922096 -0.913178 -0.707446 -1.327523 -1.677236 -0.711059 0.265797 -1.160325 2.141402 -1.343646 -0.758883 -0.217917 -0.114165 -0.300157 0.048584 -0.982127 0.385398 -0.427051 1.102205 -0.420340 -0.242903 -0.322510 -0.379963 0.472972 -0.939485 -0.480085 -0.065843 0.056103 -0.350600 -0.396663 -0.111116 -1.709592 -0.426916 1.301265 -1.334408
188 0.145762 -0.909997 0.105220 0.327819 -0.203506 -1.023043 0.048856 -0.599506 -0.087229 -0.611806 -0.541349 0.146638 -0.666394 -0.189456 0.113212 -0.406469 0.551974 -0.479994 -0.400292 0.342668 -0.166308 -0.148452 1.139844 0.131443 -0.351985 -0.500425 1.193577 -0.181093 -0.260514 -0.458698 -0.441647 1.515916 -0.217723 -0.114165 2.521318 0.123367 0.498276 -0.334969 -0.307720 -1.607749 1.770706 -0.106146 -0.000939 -0.240065 -0.148674 -1.436315 -0.433533 -0.063859 -0.239782 -0.723190 -0.396663 -0.733504 0.032614 -0.456850 -0.255831 0.749396
189 -0.340152 -0.093157 0.124878 -0.396311 -0.203506 -0.132215 0.011476 1.057665 -0.040379 0.522573 -0.035009 -0.136216 -0.452109 -0.129267 -0.130952 -0.046011 -0.586485 -1.250474 -0.713999 0.202691 0.017405 0.321345 1.757650 -0.244982 0.621650 -0.790485 -0.155186 -0.181093 0.542019 -0.406238 -0.165177 0.067421 -0.217917 -0.114165 -0.149190 -0.508333 -0.252066 -0.046204 -0.427051 1.102205 0.666370 -0.304459 -0.168599 -0.379963 -0.368366 -2.200526 -0.480085 -0.039043 -0.071966 0.061210 -0.396663 -0.337936 -0.327599 -0.111637 1.301265 -1.334408
190 0.162812 0.019758 -0.246596 0.824668 -0.203506 -0.248122 -0.035095 0.377954 0.043454 0.132329 1.889081 -0.028149 2.648425 -0.156285 0.217652 -0.177206 -0.447855 0.127263 -0.695720 0.426654 -0.078279 0.964240 1.103584 -0.685936 0.826993 0.242023 -0.924206 0.163974 -0.211876 -0.023191 0.050587 0.980563 1.736955 -0.114165 0.193974 0.671164 -0.852339 0.266813 2.706966 -0.807968 0.860316 0.033746 -0.322756 2.911363 -0.376464 0.634272 1.713091 -0.055613 0.228334 0.880374 -0.396663 -0.282509 1.268957 0.115871 -0.825378 0.749396
191 -0.289003 -0.238933 0.341264 -0.357740 -0.203506 2.401179 -0.084117 0.361413 -0.106195 -0.562501 -0.946420 0.278136 -0.222148 0.118745 -0.130952 -0.373211 0.241223 -1.093048 1.330775 -0.413208 -0.082106 -0.559492 -1.506004 0.260503 -1.327523 0.521035 0.478613 -0.181093 0.006997 -0.543288 0.158111 -0.010465 -0.117757 -0.114165 0.347534 -0.666410 0.761909 -0.151636 -0.043935 0.108848 -0.420340 -0.174058 -0.184123 -0.318061 0.501397 -0.052223 -0.383666 -0.065843 0.042854 -0.034166 0.000169 -0.382312 0.107487 -1.282326 -0.815343 0.749396
192 7.894818 0.691151 9.684524 8.237573 1.074102 2.123002 -0.084117 0.540363 18.732660 13.709010 1.382742 3.032256 1.746560 -0.157993 20.193747 -0.311287 13.197043 2.022171 15.417188 1.126539 0.296802 -0.271790 0.428798 2.443766 -1.327523 -1.426426 1.202913 0.681074 -2.157412 1.650724 0.491636 0.602773 -0.217917 -0.114165 -0.635792 4.175609 -1.020658 0.329454 -0.427051 0.245234 0.854621 -0.009885 1.406976 -0.379963 1.569991 -1.143028 -0.480085 -0.065843 0.572797 10.991115 1.700244 -0.750487 1.179556 -0.360370 1.301265 -1.334408
193 -0.178180 -0.603898 -0.246596 -0.439245 -0.203506 1.430872 -0.036320 -0.405518 -0.085916 0.498359 -1.351492 0.311253 -0.915576 -0.023595 -0.130952 -0.382592 -0.148323 -0.140317 -0.091279 -0.217240 -0.166308 -0.512115 -0.221815 0.841272 -1.327523 0.624635 0.221633 -0.181093 1.393191 -0.516962 -0.602861 0.047726 -0.217917 -0.114165 0.683539 -0.069366 -0.590733 -0.140646 -0.427051 -1.265621 0.183508 -0.296254 -0.333056 -0.379963 0.719492 3.082338 -0.480085 -0.065843 0.069352 -0.011881 -0.396663 5.251124 0.000206 -0.540565 1.301265 -1.334408
194 -0.459499 -0.251285 -0.246596 -0.199524 -0.203506 -0.324290 -0.084117 -1.739376 -0.106195 0.456393 0.572598 0.664887 -0.462531 -0.065711 -0.130952 0.262296 0.298893 -0.122775 -0.085350 -0.413208 -0.166308 1.920774 -0.342107 -0.610651 0.632394 -0.226937 3.519148 -0.181093 0.371785 -0.574654 0.263416 0.089802 -0.054671 -0.114165 -0.331635 -0.549676 0.329956 -0.137846 0.043993 -1.403770 -0.298355 -0.304459 0.448324 0.228533 -0.538110 0.291950 0.313588 -0.053740 0.069352 -0.155391 -0.396663 0.761858 -0.463935 -0.674077 -0.819566 0.749396
195 0.009365 0.916458 -0.246596 0.318230 0.120178 -0.347471 -0.084117 0.472693 -0.106195 -0.507132 0.471331 -0.269568 2.648425 -0.321829 -0.130952 0.124935 -0.893520 -0.382774 -0.560357 -0.581181 -0.166308 -1.143099 -0.645888 -0.083657 0.672971 -0.292063 -0.686369 -0.073613 -1.208963 0.777592 0.642685 0.404030 0.473305 -0.114165 -0.842924 -0.226227 -1.040938 0.097251 0.621808 1.102205 -0.151895 -0.057003 -0.351439 -0.170115 0.318227 -0.507419 -0.213255 -0.058090 -0.208868 -0.212438 -0.396663 -0.476440 -0.200202 0.950020 -0.825739 0.749396
196 -0.493599 -1.594110 -0.246596 -0.211812 0.132166 -0.281239 0.107070 -0.553641 -0.050518 -0.204042 -0.338813 -0.276518 -0.777169 -0.130368 -0.130952 -0.231586 1.125121 -0.387828 0.418309 -0.371215 -0.166308 0.116090 -0.798756 0.163708 0.653957 0.499424 -0.414538 -0.181093 -0.138918 -0.240278 0.249958 -0.489864 -0.217917 -0.114165 -0.452606 -0.698026 -0.191227 -0.837864 -0.427051 0.254116 -0.420340 0.072271 -0.307331 -0.379963 5.379274 0.484390 -0.480085 -0.065843 -0.058718 -0.167870 -0.396663 2.756591 -0.041515 1.998213 1.301265 -1.334408
197 1.390385 -0.518012 0.258581 1.167361 -0.203506 -0.688569 -0.084117 -0.700260 -0.106195 -0.913574 -0.237545 -0.503618 -0.829839 0.612332 -0.130952 0.107159 0.071161 3.247537 0.124734 1.882415 -0.043833 8.085044 -1.227779 -0.019127 0.793474 -0.797269 -0.118849 -0.181093 -0.187557 1.358282 0.586776 -0.640712 -0.217723 -0.114165 2.100633 -0.546637 1.524418 -0.201651 -0.383087 -1.298571 0.018808 0.290675 -0.415260 -0.331061 -0.466076 0.110611 -0.422869 -0.019613 -0.239782 0.801934 -0.396663 -0.726270 -2.206513 -0.138703 -0.601852 0.749396
198 0.086089 1.099497 0.052511 -0.071759 -0.203506 -1.013108 -0.084117 1.191502 -0.106195 0.533923 -0.642617 -0.171706 -0.965739 -0.176198 -0.130952 -0.195672 0.279449 -0.317365 -0.179709 0.762599 -0.062969 0.725239 2.384895 -0.890281 0.696865 -0.987411 -0.349802 -0.181093 -0.600983 -0.154735 -1.052787 -0.190407 -0.209958 -0.114165 2.078661 -0.558796 1.490957 0.275125 -0.219792 -0.164863 -0.420340 -0.300568 -0.114388 -0.223351 -0.438920 0.778602 -0.374206 -0.064599 0.047271 1.087616 -0.362519 -0.761605 1.695103 0.690131 -0.723600 0.749396
199 -0.374251 0.273967 0.451108 -0.331349 -0.203506 -0.248122 -0.084117 0.692246 -0.106195 -0.276349 0.572598 0.099091 -0.440111 -0.148689 -0.130952 0.013485 -0.201365 -0.231888 1.123284 -0.217240 -0.093588 0.405512 -0.146532 -0.879526 0.762753 0.026877 0.201469 0.462455 -0.163238 -0.360702 0.241510 -0.344388 -0.217917 -0.114165 0.896103 0.086279 0.745685 -0.369330 -0.427051 1.102205 -0.420340 -0.169339 -0.314625 -0.379963 -0.490589 -0.588836 -0.480085 -0.051073 -0.213284 0.106670 -0.396663 -0.541305 0.844674 -0.454122 1.301265 -1.334408
200 0.444131 1.045002 1.096141 -0.022022 -0.203506 -0.700159 0.142611 0.359157 8.167960 6.279828 1.180206 1.034660 1.092961 2.601779 0.988672 0.847852 2.120241 0.089802 2.343529 1.000560 0.536012 -0.096311 -0.759848 -1.175288 0.684001 -1.004662 -0.532039 -0.181093 -2.351966 -0.986462 -0.915268 -0.122369 -0.109022 -0.114165 0.314575 -0.136244 -0.939540 -0.165426 -0.182108 0.522705 0.033892 -0.018372 -0.337590 -0.104499 -0.494685 1.641809 0.106513 0.148201 -0.239782 0.059428 -0.396663 -0.724829 3.457982 -0.781711 -0.822326 0.749396
201 -0.459499 0.267206 -0.246596 -0.353680 0.138445 -0.016308 -0.084117 -1.739376 -0.075279 -0.419554 0.673866 0.135076 -0.112319 -0.200008 -0.130952 -0.282162 0.160772 -0.270390 -0.660273 -0.805144 -0.166308 -1.103133 0.554270 -0.664426 0.699489 -0.068776 -1.008433 -0.181093 -1.063048 0.241421 -0.294392 -0.817074 3.042338 -0.114165 0.882524 0.189637 -0.688075 -0.355922 1.815121 1.102205 -0.420340 -0.191384 -0.268989 3.406579 0.015914 0.576910 4.082878 -0.056624 0.250415 0.306336 -0.396663 -0.593720 -2.042611 0.109647 -0.829461 0.749396
202 0.026415 -0.326015 -0.246596 -0.364175 -0.203506 -0.715062 -0.082891 0.818564 -0.106195 -0.418237 -0.035009 0.002673 -0.705150 0.381475 -0.130952 -0.334312 -0.370765 -0.447885 -0.666449 0.314672 0.074816 0.604812 0.349025 -0.599896 -0.350076 0.365200 -0.714028 -0.181093 -1.136006 0.289265 3.690297 0.061154 -0.182783 -0.114165 0.589230 -0.411663 0.190027 0.003199 -0.156986 1.102205 -0.420340 -0.298631 -0.206596 -0.184353 -0.515467 0.950689 -0.264763 -0.065843 -0.094047 -0.517285 -0.300591 -0.734590 0.607016 1.468539 -0.798243 0.749396
203 -0.536223 1.040771 -0.134426 -0.245956 -0.203506 -0.672010 -0.024065 -1.739376 -0.106195 -0.189673 -0.237545 0.035879 -0.984272 0.035883 -0.130952 -0.102032 0.322053 -0.199779 -0.184032 0.314672 -0.166308 -0.166185 -0.123279 -0.288002 -1.327523 -0.565550 0.549797 -0.181093 -0.211876 -0.986462 -0.268907 -0.649664 0.035590 -0.114165 1.565396 -0.698026 2.619510 0.126186 -0.018813 -1.425536 -0.066544 -0.304459 -0.212609 0.008162 -0.269746 -0.679505 0.264678 -0.061231 -0.239782 0.393689 0.651057 -0.661258 -0.643483 -0.701143 -0.825269 0.749396
204 0.128713 0.698022 0.134482 -0.251355 0.513793 0.400957 0.160994 -0.480707 -0.106195 -0.295224 0.572598 -0.569720 0.869171 0.707726 0.268511 -0.068642 0.041409 -0.499765 -0.100418 -0.273231 -0.166308 0.082609 0.511218 0.228238 -1.327523 0.152185 -0.646176 -0.181093 1.879575 0.010801 4.486777 0.368221 0.798438 -0.114165 -0.069447 0.240708 -1.176810 -0.080106 1.676948 -1.129694 1.073502 -0.296201 0.021731 1.051829 -0.608560 3.583794 0.547370 -0.056067 0.148843 0.626335 -0.396663 -0.678933 0.596586 0.205811 -0.823288 0.749396
205 -0.161131 0.262828 -0.095560 0.482579 -0.193230 -0.142150 -0.084117 -0.300253 -0.043593 -0.002888 -0.136277 -0.243026 2.648425 0.000825 -0.130952 0.446232 0.372799 1.267003 0.411516 -0.273231 -0.166308 2.209269 -0.447435 0.217483 0.699992 -0.393821 -0.530192 -0.181093 2.317321 0.632494 -0.253873 1.045916 -0.099316 -0.114165 -0.872056 -0.409231 1.901616 -0.159507 0.069116 1.102205 -0.420340 -0.125197 1.067761 0.175916 -0.435590 -0.350136 0.105939 -0.051044 -0.239782 -0.927312 -0.293944 -0.212234 0.707592 0.825460 -0.813779 0.749396
206 -0.271953 -0.513375 -0.246596 -0.411731 -0.203506 1.689179 -0.084117 -0.161905 -0.106195 0.080466 0.876402 -0.400625 -0.779141 -0.041476 -0.130952 -0.354385 1.373647 -0.569187 1.036583 0.370663 -0.166308 -0.986941 -0.299401 -0.379419 0.749174 2.013588 2.038361 -0.181093 -0.284834 -0.468377 -0.536464 -0.976873 -0.217917 -0.114165 -0.115861 -0.566700 -0.333184 -0.015189 -0.427051 1.102205 -0.420340 0.029415 -0.116803 -0.379963 -0.344856 2.171946 -0.480085 -0.065843 -0.239782 1.273465 -0.015851 0.874221 -0.016930 -0.276760 1.301265 -1.334408
207 0.657251 0.514617 0.577984 -0.077072 -0.203506 -0.277927 -0.084117 2.268213 0.037931 0.657267 1.686546 0.701579 -0.324829 -0.135038 -0.130952 0.116736 0.421309 -0.544362 -0.282343 -0.077263 -0.116552 -0.633600 0.128471 -0.911791 -1.327523 0.820205 -0.584100 -0.181093 -0.382110 -0.034946 0.037415 1.686906 -0.117757 -0.114165 0.637619 0.705819 1.938119 0.146366 0.458513 -1.141988 -0.420340 0.013938 -0.060423 0.131347 -0.026069 1.812971 -0.208284 -0.055356 0.197421 0.415973 -0.123113 0.628699 -1.074843 -1.288760 -0.793162 0.749396
208 -0.689669 -0.209472 -0.246596 -0.432550 -0.203506 -0.118969 -0.084117 -1.739376 -0.106195 -0.166782 -0.237545 -0.031834 0.250793 -0.019028 -0.130952 -0.201412 -0.621326 -0.018122 0.390273 -0.945121 -0.166308 0.239032 -0.164144 -0.825751 -0.513581 -0.213466 -0.893175 -0.181093 -1.184644 0.100219 0.057674 -0.779474 0.898404 -0.114165 -0.303490 -0.291889 -1.288348 -0.463043 2.895384 -1.219001 -0.420340 -0.304459 -0.336358 1.728417 -0.226758 0.423327 0.517815 -0.038626 -0.032221 0.238592 -0.396663 -0.668890 -1.760625 0.520811 -0.819255 0.749396
209 -0.297528 -0.706154 -0.246596 0.618724 -0.203506 0.351282 -0.084117 -0.280704 -0.106195 -0.003684 -0.440081 -0.048602 -0.601838 0.148410 -0.130952 0.148485 0.368326 0.499347 -0.045828 -0.301227 -0.166308 0.117281 0.030280 -0.524611 0.780212 0.714376 -0.664969 -0.181093 -0.673941 -0.008197 -0.725991 0.293916 -0.217917 -0.114165 -1.106840 -0.096117 -0.420385 0.186427 -0.427051 1.102205 -0.420340 0.029168 -0.344835 -0.379963 -0.136460 -0.348286 -0.480085 -0.065843 -0.239782 0.629901 -0.396663 2.051037 -0.479208 1.205572 1.301265 -1.334408
210 -0.425400 -0.218053 0.034016 -0.286212 -0.203506 1.328212 -0.084117 1.024582 -0.106195 0.290820 0.572598 0.250115 -0.119276 -0.181260 -0.130952 -0.372752 -0.443299 -0.457993 -0.002354 0.118704 0.025060 -0.137600 -0.750179 -0.180452 0.852796 -0.625539 0.526987 -0.181093 -0.795537 -0.986462 -0.778608 -0.796483 -0.217917 -0.114165 -0.480750 -0.698026 -0.432553 -0.491165 -0.427051 1.102205 0.930814 -0.109614 -0.312308 -0.379963 -0.467850 -0.921907 -0.480085 -0.045276 -0.239782 1.854635 -0.396663 0.009044 -0.986560 -0.671489 1.301265 -1.334408
211 0.086089 0.766083 0.235397 -0.270447 -0.203506 -0.122280 -0.084117 -1.739376 -0.106195 0.904612 0.977670 0.643508 -1.003960 -0.193695 -0.130952 -0.409815 0.321390 0.415505 -0.190331 -0.217240 -0.093588 0.068581 0.718074 -0.438572 -1.327523 0.898026 0.327152 -0.181093 -0.819856 -0.664284 -0.930086 -0.613855 -0.084758 -0.114165 0.520104 0.164102 0.329956 -0.156579 0.270095 -1.221153 -0.420340 -0.304459 -0.110741 0.421048 -0.353811 2.059072 0.149717 -0.055433 0.170924 2.708561 -0.396663 -0.584063 -0.094038 0.335056 -0.809131 0.749396
212 -0.578847 -0.766935 -0.246596 -0.398859 -0.203506 -1.029666 -0.084117 -0.488226 -0.106195 -0.484905 -2.161636 -0.685267 -1.023873 -0.063096 -0.130952 -0.323882 -0.530697 -0.595499 0.193651 -0.301227 -0.166308 0.081021 -0.321042 0.163708 -1.327523 -0.660815 1.036890 -0.181093 -0.114599 -0.352951 -0.396117 -1.213664 -0.217917 -0.114165 -0.086235 -0.698026 -0.684019 -0.322580 -0.427051 -1.030890 0.216140 -0.296289 -0.098568 -0.379963 -0.100590 0.519548 -0.480085 -0.065843 -0.067550 -0.884527 -0.396663 1.606678 0.123132 0.334986 1.301265 -1.334408
213 0.213961 -0.298118 -0.246596 -0.355148 -0.203506 -0.473313 -0.057155 -0.182958 -0.106195 0.110827 -0.237545 -0.437802 -0.980328 -0.034680 -0.130952 -0.400303 -1.288794 -0.438073 -0.104741 0.006723 -0.166308 -0.598928 0.096239 -0.255737 -1.327523 0.464923 -0.814496 -0.181093 -0.795537 -0.434660 -0.745105 -1.525206 0.636357 -0.114165 -0.412118 0.953879 -0.852339 -0.217762 1.274990 1.102205 -0.420340 -0.083766 -0.202703 0.930501 2.218186 0.604666 0.587737 -0.061187 -0.023388 0.760040 -0.258748 -0.686358 -0.254215 -0.555252 -0.823631 0.749396
214 -0.485074 1.865249 -0.246596 -0.478507 -0.203506 -0.327601 -0.084117 -0.793495 -0.106195 -0.668787 -2.566707 -0.251344 -0.378343 -0.017172 -0.130952 -0.294888 -0.906704 -0.164696 -0.363734 -0.917126 -0.158653 -0.701887 -0.823851 -0.094412 -1.327523 0.534991 -0.837601 -0.181093 0.420423 0.180383 0.440523 0.124716 -0.183754 0.086643 -0.653814 -0.099765 -0.477168 -0.321635 -0.144425 -0.766864 0.180891 -0.304459 -0.041499 0.083064 0.058691 -0.586986 -0.019436 -0.065843 -0.239782 -1.189373 0.172850 -0.167930 -1.984128 -0.613930 -0.795688 0.749396
215 -0.485074 0.808808 0.035442 -0.352557 -0.203506 -0.748178 -0.084117 -0.420556 -0.106195 0.041762 -1.250224 -0.535610 0.312927 -0.166200 -0.130952 -0.300660 -0.773495 -0.758723 -0.250108 -0.693162 -0.166308 -0.593105 -0.219628 -0.556876 -1.327523 0.949971 0.157072 -0.181093 1.004084 -0.693194 -0.510227 -0.519855 -0.099316 -0.114165 0.049796 -0.698026 1.467635 0.013673 0.006310 -1.114855 -0.420340 -0.304459 -0.270763 0.241532 1.412135 1.061712 0.269074 -0.060641 0.201837 -0.811435 -0.378794 -0.464447 -0.168539 -1.043978 -0.816123 0.749396
216 -0.220805 -0.176832 1.813728 -0.028696 2.033736 0.460566 -0.084117 -0.134837 -0.106195 0.121239 -0.035009 0.061406 -0.606350 0.084048 0.152824 -0.359305 -0.602391 -0.263701 -0.865418 -0.021273 1.831573 -0.440653 0.572227 -0.288002 1.131433 -0.697061 4.966270 0.976893 2.584832 0.384550 -0.378578 0.009231 0.064706 1.013603 -0.689858 -0.698026 0.192055 0.080713 0.471074 0.577801 0.754569 0.254403 -0.248980 0.179631 2.012142 0.247540 0.022397 -0.057926 -0.023388 0.452519 0.175623 -0.298691 0.045651 -0.257247 -0.818313 0.749396
217 0.162812 -1.134239 -0.246596 0.279918 -0.203506 -0.238187 -0.084117 0.944882 0.038112 0.251919 0.167527 -0.198921 0.565826 -0.095721 -0.130952 -0.204823 0.088060 0.183900 0.223169 -0.105259 -0.166308 -0.118544 0.434783 -1.944270 0.644792 -0.751913 -0.334065 -0.181093 -0.868494 -0.528399 -0.551318 2.098715 -0.057583 -0.114165 -0.468900 0.410945 1.021486 0.421352 0.207289 -0.283431 -0.420340 -0.304459 -0.086790 0.130109 -0.323757 0.362264 0.009514 -0.051344 0.113513 0.522937 -0.396663 0.041821 0.357065 -1.016702 -0.814815 0.749396
218 0.358883 0.761230 0.523737 0.059634 0.497808 0.755301 -0.084117 -0.158897 -0.106195 0.567083 1.281474 0.348828 1.190795 -0.321829 -0.130952 0.131954 3.445308 0.085193 -0.049410 0.398658 -0.166308 0.064346 -0.088861 0.163708 -0.643541 -1.825900 -0.731666 0.628000 -0.576664 0.896958 0.264561 1.178411 -0.144932 -0.114165 -0.861687 -0.698026 -0.862478 -0.147823 -0.069057 -0.303278 0.451337 0.067852 -0.325319 -0.136069 -0.681442 -0.709112 -0.190743 -0.046142 -0.147042 -0.122411 0.178285 -0.564049 0.846164 1.637438 -0.810527 0.749396
219 -0.433925 -0.678325 -0.246596 -0.086315 0.029409 -0.254746 -0.084117 0.257652 -0.080394 -0.973301 -1.959100 -0.722797 -0.713318 0.084224 -0.130952 -0.232177 -0.237709 -0.513887 0.247129 -0.609176 -0.166308 1.131779 -0.482889 -0.309512 0.770749 0.324593 -0.971586 0.130365 -1.208963 1.116840 0.881787 0.296602 -0.217917 -0.114165 -0.301762 -0.698026 -0.505559 2.415644 -0.427051 -1.136529 -0.420340 -0.267518 0.087031 -0.379963 0.366078 1.633483 -0.480085 -0.065843 -0.116128 -0.212438 0.345196 0.939046 -0.248255 3.795999 1.301265 -1.334408
220 -0.553272 -1.124011 -0.246596 -0.470128 0.081358 -0.840903 0.260877 -1.393505 -0.051922 -0.119584 -1.554028 -0.605860 -0.554999 -0.271961 0.423184 -0.379312 -0.392576 -0.329109 -0.699919 -0.077263 -0.166308 0.213887 1.743491 0.540133 -0.275216 -0.099400 -0.839200 0.214386 1.490468 0.091621 0.511895 -0.504636 -0.182783 -0.114165 0.218909 0.499711 -1.308627 -0.488229 -0.219792 -1.139894 1.561368 0.050174 -0.307971 -0.189305 -0.415694 0.317855 -0.203073 -0.057640 0.113513 -0.703580 -0.293482 -0.638176 0.588018 -0.172273 -0.806036 0.749396
221 -0.067358 -0.375096 -0.079579 -0.110244 -0.203506 0.175766 -0.084117 0.217049 -0.106195 0.012705 0.876402 -0.234311 -0.471544 -0.149472 -0.130952 -0.211251 0.394409 -0.218212 0.235272 -0.385213 -0.166308 -0.916802 -0.292379 -0.610651 0.665820 -0.719738 1.350613 -0.181093 -0.722579 -0.132730 -0.072042 -0.449131 -0.142991 -0.114165 0.123366 -0.533869 0.220447 -0.077947 -0.357965 -1.076828 -0.420340 -0.304459 -0.211722 -0.208495 -0.513667 0.599114 0.606975 -0.049709 0.025190 -0.062689 -0.396663 -0.703254 0.959033 0.869731 -0.828097 0.749396
222 -0.416875 -0.683779 -0.175317 -0.233301 -0.203506 1.285161 -0.084117 0.068175 -0.106195 -0.387517 -0.237545 -0.411944 0.963976 -0.221844 -0.130952 -0.091667 0.350941 -0.710410 0.160428 -0.217240 -0.166308 -0.982309 -1.191748 -0.427817 1.021866 -0.457977 -0.581790 -0.181093 -1.063048 -0.314574 0.154603 -0.474198 -0.121639 -0.114165 -0.415327 -0.698026 -0.795556 0.209991 -0.320281 -0.823796 -0.420340 -0.238554 -0.175302 -0.316823 -0.569986 -0.748895 -0.109526 -0.065843 -0.239782 -0.518176 0.732273 -0.627107 0.146228 -0.633443 -0.827343 0.749396
223 -0.352939 0.410531 -0.246596 -0.401861 -0.203506 -0.335880 -0.084117 1.035109 -0.106195 0.120888 0.370063 0.190114 -0.818601 -0.140737 -0.130952 0.091940 0.157032 0.621541 -0.050028 -0.525190 -0.131862 -0.372764 0.227582 1.922149 -1.327523 0.400573 0.213754 -0.181093 0.651455 0.751161 0.628654 -0.723074 -0.081846 -0.114165 -0.226587 0.165318 -0.278429 -0.121618 -0.119302 0.550190 -0.420340 0.789284 0.199347 0.023019 0.004800 0.261418 0.266719 -0.041985 -0.239782 -1.154610 2.261946 -0.043623 0.170068 0.871724 -0.822308 0.749396
224 -0.152606 -0.390568 -0.246596 -0.162767 0.494669 0.233719 -0.084117 0.690742 -0.073694 0.586900 -0.440081 -0.271929 -0.760467 -0.101445 -0.130952 -0.437169 -0.292657 -0.517975 -0.483288 -0.357217 0.067161 -0.738279 0.012783 1.454307 0.849290 -0.282081 -0.318872 -0.181093 1.952533 0.201646 -0.135826 0.474754 -0.217917 -0.114165 0.283345 -0.267570 1.585256 0.140664 -0.427051 -0.823639 -0.420340 -0.298543 -0.066929 -0.379963 -0.236684 -0.105884 -0.480085 -0.065843 -0.239782 1.562267 -0.396663 0.775115 -0.733629 -0.518780 1.301265 -1.334408
225 -0.306053 0.606259 -0.246596 -0.416050 0.339106 -0.410392 -0.084117 -0.175439 -0.106195 0.075749 0.572598 -0.115477 2.648425 0.038617 -0.130952 0.052188 -0.320220 -0.206617 0.699533 -0.609176 -0.166308 -0.782877 -0.005059 -0.513856 0.578873 0.503591 1.099476 -0.181093 0.250189 0.090731 0.740831 -0.877950 0.520862 -0.114165 0.656382 0.449856 -0.248010 -0.116342 0.282656 -1.147354 -0.420340 -0.183460 -0.272783 0.232247 -0.458260 -0.712812 0.657066 -0.052975 0.060519 -0.418343 -0.115047 0.193088 -0.533594 1.582607 -0.828219 0.749396
226 -0.672620 -0.693603 -0.246596 -0.386117 -0.203506 0.112845 -0.084117 0.409534 -0.106195 -0.089253 -3.073047 -0.513149 -0.831529 -0.147789 -0.130952 -0.327490 0.178358 -0.449520 -0.529480 0.118704 -0.166308 -0.525349 -0.391260 0.722967 0.987781 4.801573 -0.635093 -0.181093 1.174318 0.565885 -0.743029 -1.140254 -0.217917 -0.114165 -0.024021 1.498028 0.234642 0.108944 -0.427051 0.065577 -0.420340 -0.066229 -0.405354 -0.379963 -0.345013 -0.037420 -0.480085 -0.065843 -0.204452 -0.397842 0.508328 1.278297 0.183851 1.772349 1.301265 -1.334408
227 -0.583109 -0.560031 -0.246596 -0.463930 -0.203506 -0.609090 -0.084117 -0.507775 -0.054909 0.164180 -2.262903 -0.342312 -0.134739 -0.129118 -0.130952 0.685432 -0.732440 -0.300121 0.087065 -0.721158 -0.166308 1.033717 -0.265558 -0.067524 -1.327523 2.084528 0.766020 -0.181093 0.055635 -0.570101 0.313563 -0.550740 -0.217917 -0.114165 -0.582589 -0.144756 -0.191227 -0.179889 -0.427051 0.160818 -0.420340 -0.304459 -0.444386 -0.379963 -0.506021 0.266969 -0.480085 -0.049070 -0.094047 -0.967869 -0.396663 -0.605179 -0.566560 0.412268 1.301265 -1.334408
228 -0.553272 -0.140743 -0.059884 0.200702 -0.203506 -0.575973 -0.084117 -1.123576 -0.106195 -0.368323 -0.338813 -0.229953 -0.637975 0.402186 -0.130952 -0.260351 0.017053 -0.584499 -0.315196 -0.245236 -0.166308 0.403659 -0.083335 -0.610651 -1.327523 -0.972487 -0.597030 -0.181093 -1.306240 -0.172695 -0.232827 0.138145 -0.158520 -0.114165 0.711436 -0.698026 1.097535 -0.250320 -0.106741 -1.278794 -0.420340 -0.300497 -0.121041 -0.092119 -0.461122 -0.498167 -0.139161 -0.054175 -0.239782 -0.049319 0.159396 -0.751644 0.211788 1.072201 -0.808349 0.749396
229 2.132044 1.112246 -0.246596 8.237573 0.728726 0.382743 -0.084117 0.183966 0.325279 -0.468108 0.775134 -0.383085 2.648425 -0.321829 -0.130952 -0.189670 -0.907213 3.049751 -0.219973 1.826424 -0.001732 -0.953592 0.028668 0.260503 1.009095 -1.450267 -0.903237 0.083447 -0.017322 1.489000 -0.104256 0.268850 0.605299 -0.114165 -0.302256 -0.209203 -0.866534 0.680378 0.722298 1.102205 -0.373701 0.201862 -0.401165 -0.011646 1.215086 -1.025528 -0.028366 -0.065843 -0.239782 -1.026254 -0.396663 -0.573377 1.025712 0.660092 -0.826284 0.749396
230 -0.433925 -0.881536 -0.246596 -0.438467 -0.203506 -0.811099 -0.084117 1.041124 -0.106195 0.344886 -0.136277 -0.201105 -0.966302 -0.321829 -0.130952 -0.253037 -0.185637 -0.174805 0.435600 -0.189245 -0.166308 0.628897 -0.850787 1.766201 0.898285 0.773105 0.390813 -0.181093 0.906807 0.334779 -0.055720 -1.469702 -0.217917 -0.114165 0.295936 -0.385520 -0.156752 -0.374908 -0.427051 1.102205 -0.420340 0.041282 -0.304965 -0.379963 -0.345797 0.066202 -0.480085 -0.065843 -0.239782 -0.223135 -0.396663 0.146037 -0.017302 1.300758 1.301265 -1.334408
231 -0.459499 -0.928763 -0.246596 -0.389314 -0.203506 -0.440197 -0.084117 -0.148371 -0.056131 -0.370525 -0.642617 -0.402655 0.137398 -0.050709 -0.130952 -0.217614 -0.488957 0.385477 -0.202558 -0.413208 -0.166308 -0.907803 -0.222506 -0.277247 0.467771 -0.561480 -0.059823 -0.181093 -0.795537 -0.828317 0.513971 0.204392 -0.110963 -0.114165 -0.347929 -0.521709 0.603729 -0.431960 0.320339 -0.877106 -0.159283 -0.170712 -0.284907 0.149299 -0.480963 -0.746120 -0.132403 -0.056416 -0.089631 -0.772215 -0.396663 -0.746603 -0.744804 -0.949981 -0.801360 0.749396
232 -0.604421 -0.651509 -0.246596 -0.341542 -0.081054 -0.301108 -0.055316 0.872700 -0.106195 -0.452702 0.268795 0.146340 -0.285439 -0.183335 0.006751 0.057305 0.325769 -0.179562 -0.123761 -0.469199 -0.124207 -0.175846 0.288706 0.271258 0.747321 -0.850086 0.518356 -0.181093 1.356712 -0.184301 0.481542 -0.951807 -0.217917 -0.114165 -0.715904 0.125191 -0.577551 -0.164688 -0.427051 1.102205 0.195899 0.442117 -0.312998 -0.379963 -0.580325 -0.375116 -0.480085 -0.045377 -0.049886 -1.066365 0.031603 0.580811 1.368416 0.455699 1.301265 -1.334408
233 -0.574584 0.461618 -0.246596 -0.369035 -0.203506 -0.215006 0.042115 -1.739376 -0.106195 0.243682 0.066259 -0.277026 -0.288275 -0.264907 0.264791 -0.437169 -0.224881 -0.443722 -0.362993 0.048716 -0.166308 0.515881 -0.037406 -0.427817 1.085248 0.348337 0.544517 -0.181093 -0.856335 -0.609197 -0.684650 -0.967473 -0.217723 -0.114165 0.273223 -0.359984 -0.613041 -0.337901 -0.383087 -0.478199 0.252235 -0.173230 -0.375834 -0.379963 -0.436025 -0.751671 -0.468161 -0.038249 -0.239782 0.220765 -0.396663 -0.454862 1.555786 -0.820456 -0.601852 0.749396
234 -0.450975 1.447330 2.557050 -0.471380 0.864878 -0.215006 -0.057767 0.955408 0.001266 0.567790 0.167527 0.419475 -0.189521 -0.048441 0.231404 -0.437169 1.028232 -0.251808 -0.433268 -0.245236 2.486051 0.478297 0.457691 -0.707446 1.074983 1.038646 0.056188 1.116484 -0.357791 0.152468 -0.270482 -0.749931 0.137497 4.007461 -0.657270 0.089319 0.585477 0.036855 0.898154 -0.334180 -0.127421 0.167686 -0.352030 0.602421 -0.297613 0.169824 0.181632 -0.040736 0.126762 0.538981 -0.396663 -0.305351 -1.236138 2.542993 -0.815876 0.749396
235 -0.450975 0.188029 0.054950 0.519380 -0.203506 -0.705127 -0.084117 0.087724 -0.106195 -0.377873 0.673866 0.050551 -0.958444 -0.105689 -0.130952 -0.417883 0.104984 -0.035961 -0.716222 -0.105259 0.044197 -0.397246 0.298606 -1.288216 -0.347589 -0.194568 -0.654787 -0.181093 -0.698260 -0.091007 -0.819770 -0.276350 0.154967 -0.114165 1.413565 -0.274866 0.384710 7.493128 0.835348 -0.985785 -0.420340 -0.113259 -0.057023 0.411144 0.065237 1.581672 0.094809 -0.065843 -0.098464 -0.244527 -0.325889 -0.668188 0.848399 1.057724 -0.817404 0.749396
236 -0.144081 -0.789325 -0.246596 0.070346 0.113327 0.586408 -0.084117 0.498257 -0.106195 -0.419172 -0.237545 0.239415 -0.522242 -0.077540 -0.130952 0.277121 -0.369747 -0.402991 -0.169829 0.426654 -0.116552 -0.766732 0.081390 -0.911791 0.715959 -0.381901 -0.349137 -0.181093 -0.455068 0.015397 -0.090726 0.625154 -0.217917 -0.114165 -0.703684 0.440129 -0.359547 0.560279 -0.427051 1.102205 -0.420340 -0.304459 0.218370 -0.379963 -0.524547 0.073603 -0.480085 -0.049428 -0.239782 0.456085 -0.396663 5.602372 -1.571392 1.383075 1.301265 -1.334408
237 -0.075883 -1.209428 -0.246596 1.134232 -0.203506 -0.221629 -0.064508 -1.739376 -0.071114 -0.446105 0.167527 0.060634 0.856440 0.129363 -0.130952 -0.283146 -0.658179 1.344006 -0.040394 0.146700 -0.166308 -0.118279 0.495102 1.131657 0.480859 -0.873830 -0.837951 -0.181093 -0.880654 -0.358796 -0.522934 -1.199788 1.186656 -0.114165 -0.559012 -0.450574 -1.109888 -0.562159 2.970751 -0.528236 -0.087016 -0.185239 0.017838 1.090208 -0.155494 -0.309427 0.284035 -0.028265 0.360820 0.964162 -0.396663 -0.510460 0.193163 -1.178398 -0.821740 0.749396
238 -0.621471 -1.594110 -0.246596 -0.103031 -0.203506 -0.642206 -0.084117 0.453144 -0.106195 -0.021100 -0.440081 -0.281880 -0.436787 -0.226104 -0.130952 -0.274487 1.569260 -0.627906 -0.487735 -0.077263 -0.166308 0.522498 -0.437074 -0.309512 0.828548 0.522489 5.120794 -0.181093 -0.114599 -0.621460 -0.787843 0.075478 -0.217917 -0.114165 2.297150 -0.698026 0.384710 -0.359512 -0.427051 1.102205 -0.420340 -0.256496 0.046767 -0.379963 -0.505790 -0.218758 -0.480085 -0.065843 -0.138209 0.366948 -0.396663 -0.225749 1.315148 -0.777235 1.301265 -1.334408
239 0.401507 1.203008 -0.246596 -0.424602 -0.203506 -1.225052 -0.084117 -0.622063 -0.106195 -0.658227 1.889081 0.040843 -0.655804 -0.055190 0.053296 -0.360813 0.258478 -1.004004 -0.377567 0.062714 -0.166308 -2.023407 -1.171604 -1.815210 -1.327523 -1.909536 -0.918228 -0.181093 -2.035816 -0.986462 -2.656914 -1.720368 4.200198 -0.114165 -0.643692 0.055272 -1.118000 -0.837864 6.211538 1.102205 -0.100330 -0.304459 -0.407572 7.377589 -0.348116 -1.384504 3.183529 -0.051499 -0.239782 -0.509262 -0.063098 -0.761004 -1.505459 -1.017961 -0.825833 0.749396
240 1.219889 0.253052 0.119100 -0.156547 -0.203506 1.099710 -0.084117 0.502769 -0.003894 0.611436 0.572598 0.131348 -0.049735 -0.190574 -0.130952 0.383980 -0.045173 -0.118910 -0.267646 1.042553 -0.040005 -0.552345 0.625409 0.335788 1.093294 -0.120430 1.010521 -0.181093 0.542019 0.381288 0.053594 0.577707 -0.217917 -0.114165 0.740568 0.206661 -0.700242 -0.048645 -0.427051 1.102205 -0.420340 -0.028902 0.764918 -0.379963 1.517961 0.647225 -0.480085 -0.065843 -0.138209 1.254747 2.196528 -0.622547 1.977089 0.029638 1.301265 -1.334408
241 0.316259 0.090524 -0.215758 -0.375146 -0.045946 -0.420327 0.074592 -0.623567 -0.106195 -0.773235 1.787814 -0.500441 -0.617188 -0.255841 0.062258 -0.162119 -0.036699 -0.407302 -0.012729 0.566631 -0.166308 -0.571931 0.725326 -0.406307 -0.349436 1.745043 -0.943832 -0.041669 1.417510 -0.376184 -0.322597 -1.662178 -0.142991 -0.114165 0.463074 0.938679 -0.531923 0.195644 0.031432 -0.496350 1.750080 -0.281164 0.052730 0.042208 -0.412519 1.167184 -0.096141 -0.051702 0.184172 0.665555 -0.281097 -0.423963 -0.753372 -0.252631 -0.805177 0.749396
242 0.478230 -0.512060 0.461875 -0.107393 -0.203506 -0.450131 -0.084117 -0.108520 -0.106195 -0.202523 -0.845152 -0.122592 -0.939009 0.333688 -0.130952 -0.060672 0.572283 0.071517 -0.379914 1.658452 -0.070624 -0.087842 -1.506004 -0.567631 0.678560 -0.419697 0.066934 -0.181093 -0.552345 -0.463231 -0.989074 -0.646978 -0.174048 -0.114165 1.807833 -0.698026 -0.147626 -0.179095 -0.244914 -1.398554 -0.420340 -0.289528 0.536442 -0.226446 -0.235476 -1.234623 -0.187831 -0.061270 0.568381 -0.737006 -0.396663 -0.781193 0.222591 -1.217424 -0.813728 0.749396
243 -0.604421 -0.407671 -0.246596 -0.457515 -0.203506 -0.575973 -0.084117 -1.739376 -0.106195 0.367740 0.066259 0.056089 -0.719796 -0.158618 -0.130952 -0.313321 2.422341 -0.342637 -0.079916 -0.553185 -0.166308 0.025439 -0.547582 1.405909 0.711967 -0.146015 0.510047 -0.181093 -0.321312 -0.196267 0.105351 0.009678 -0.217917 -0.114165 0.014245 -0.351472 1.964483 -0.703525 -0.427051 -1.161915 -0.420340 0.058396 0.130942 -0.379963 -0.581032 -0.296475 -0.480085 -0.042517 0.625791 -0.399625 0.855475 -0.186783 0.691388 -0.287880 1.301265 -1.334408
244 -0.476549 -1.265903 -0.246596 0.215604 -0.203506 -0.506429 -0.084117 -0.031075 -0.106195 -0.319516 -0.541349 -0.324462 -0.218391 -0.122334 -0.130952 0.197945 -0.655125 0.605041 2.239783 2.134374 -0.166308 1.174391 -0.801519 1.357512 1.025895 -0.047164 0.439213 -0.181093 0.396104 0.259720 -0.434846 1.079040 -0.217917 -0.114165 -0.911063 -0.080309 -0.355491 0.178823 -0.427051 1.102205 -0.420340 0.074736 -0.453602 -0.379963 -0.453530 0.043997 -0.480085 -0.065843 -0.239782 -1.262465 -0.396663 0.766530 -1.884669 -0.091774 1.301265 -1.334408
245 0.034940 0.389056 -0.246596 -0.080527 -0.203506 0.129403 -0.084117 -0.888233 -0.005162 0.198541 1.889081 0.058516 -0.079337 -0.188970 -0.130952 -0.178125 -0.486744 0.589135 2.039950 0.846585 -0.166308 -0.577754 -0.134100 0.109933 0.820987 -0.606157 0.497339 0.192923 1.222957 0.308432 0.536808 -0.447340 -0.217917 -0.114165 -0.595550 0.027912 -1.389745 -0.023224 -0.427051 -0.631896 -0.420340 -0.055066 0.257107 -0.379963 0.403929 0.378918 -0.480085 -0.049360 -0.058718 -0.231157 -0.396663 0.981960 -0.619642 0.387999 1.301265 -1.334408
246 0.358883 -0.431169 -0.246596 -0.218356 -0.203506 -1.821145 -0.084117 -0.385969 -0.106195 1.102192 0.673866 -0.372097 -0.384596 0.170024 -0.130952 -0.026267 -0.658179 -1.250474 -0.865418 0.090709 -0.082106 -2.397656 -0.396785 0.163708 0.669103 0.825244 -0.213004 -0.181093 0.931126 -0.218717 0.575250 0.655592 -0.217917 -0.114165 0.485293 -0.698026 0.121077 -0.054598 -0.427051 1.102205 -0.420340 -0.304459 -0.270270 -0.379963 -0.345855 -2.200526 -0.480085 -0.055680 -0.239782 0.629901 -0.396663 1.118753 -1.013007 -0.802343 1.301265 -1.334408
247 -0.621471 1.465598 4.100041 -0.327159 2.179308 0.132715 0.302546 -0.619056 0.014484 -0.921802 -2.364171 0.392006 -0.769537 -0.321829 0.692807 -0.437169 1.946439 -1.046817 -0.287654 2.470318 2.646800 -1.212708 2.748534 0.314278 0.764034 1.969008 -0.272312 0.329852 2.147086 -0.382771 1.053453 -1.704254 -0.217917 3.703207 -0.634557 1.338735 -0.152696 -0.430948 -0.427051 1.102205 5.118344 -0.274509 -0.265145 -0.379963 -0.140390 0.969192 -0.480085 -0.061134 -0.217701 -1.004861 -0.056849 0.096552 0.501597 -1.018870 1.301265 -1.334408
248 -0.297528 0.192583 -0.246596 -0.157670 -0.203506 0.318166 -0.084117 0.553897 -0.106195 0.255618 1.484010 0.263067 0.451417 -0.039559 -0.130952 -0.327293 -0.246337 -0.494116 0.184141 0.510640 -0.154826 0.150366 -0.470687 -0.449327 0.768386 -0.876931 1.273653 -0.181093 0.031316 -0.345030 0.288185 0.025345 0.088969 -0.114165 0.023380 -0.698026 0.342123 -0.013548 0.684614 -0.888347 -0.420340 -0.304459 -0.147506 0.498426 -0.387764 0.399272 0.177094 -0.054383 -0.058718 1.145109 -0.396663 -0.461972 0.583921 1.788085 -0.816342 0.749396
249 -0.536223 1.475818 -0.236692 -0.380070 -0.203506 -0.877331 -0.045512 -0.462662 -0.082929 -0.225137 -0.743885 -0.407752 1.790808 -0.151789 -0.130952 0.086364 -0.556757 -0.299824 0.268743 -0.721158 -0.112725 0.423774 0.151263 0.389563 0.798573 0.628899 -0.727918 -0.181093 0.809530 0.111423 0.505309 0.556221 -0.161431 -0.114165 -0.254608 -0.146580 -0.811780 -0.400871 -0.081619 -0.509986 -0.068699 -0.005290 0.244293 -0.054978 -0.205000 -0.153995 -0.135098 -0.045029 -0.063134 -0.425474 -0.396663 -0.668739 0.916195 0.339532 -0.805164 0.749396
250 -0.502123 0.035928 -0.246596 -0.364521 0.089351 -0.314355 -0.004456 0.501265 -0.087365 -0.637480 0.673866 -0.204701 -0.848710 -0.014008 -0.130952 -0.361994 -0.412198 0.137669 -0.851585 0.118704 -0.166308 -1.361456 -0.254046 1.131657 0.692108 1.942842 0.542099 -0.181093 0.906807 1.228158 1.251105 0.677973 -0.217917 -0.114165 -1.400381 -0.173940 -0.144584 -0.099098 -0.427051 1.102205 -0.420340 7.859365 -0.188657 -0.379963 0.327505 -1.358599 -0.480085 -0.065843 0.546300 3.984994 0.894018 0.461758 0.266919 3.007171 1.301265 -1.334408
251 -0.689669 -0.815220 -0.246596 -0.296794 -0.203506 -0.383899 0.097265 -0.197996 -0.012133 -0.340807 -1.554028 -0.521224 -0.780775 -0.165735 0.247607 -0.325260 -0.552508 -0.035218 0.036674 -0.665167 -0.166308 -0.033848 0.562558 0.529378 0.781754 -0.668956 -0.792774 -0.181093 0.979765 -0.367437 0.191041 -0.760674 -0.217917 -0.114165 -0.240536 0.266244 -1.188978 -0.454592 -0.427051 -0.563464 1.996745 0.000362 -0.171704 -0.379963 -0.223552 -0.141042 -0.480085 -0.040185 -0.226533 -0.360405 0.137271 -0.129879 -0.235962 -0.824023 1.301265 -1.334408
252 0.316259 -0.216613 -0.246596 1.635443 1.690072 -0.238187 -0.084117 0.471189 -0.106195 0.204731 -0.338813 -0.665277 -0.538860 -0.170532 -0.130952 0.082363 -0.174335 -0.585688 -0.253196 -0.133254 -0.105070 -0.684418 -0.549999 -0.535366 -0.307448 -1.746626 -0.276476 0.238178 -0.114599 4.227401 -1.541155 -1.195759 1.030398 -0.114165 -0.994756 1.577067 0.978900 -0.155017 0.307778 -1.274642 -0.420340 -0.009375 -0.442070 1.139730 0.553602 0.019942 2.996120 -0.065843 -0.195620 -0.912159 0.128855 -0.394507 -1.465601 -0.083172 -0.829889 0.749396
253 0.938570 1.353977 -0.120320 -0.149118 -0.203506 -0.724996 -0.084117 -0.782968 -0.020009 0.052520 0.673866 -0.127148 -0.474248 0.397743 0.065096 -0.389807 -1.586338 0.196387 -0.566655 0.538635 0.116916 -1.068461 -0.311487 0.432583 0.829188 -1.350543 -1.050346 -0.114709 -1.549432 0.346237 -0.089437 1.118430 -0.109022 -0.114165 -0.499760 0.375074 -0.862478 0.735601 -0.119302 0.347691 -0.018057 -0.252640 -0.207336 -0.103261 0.304731 -0.722064 -0.012205 -0.056721 0.802439 -0.365753 -0.396663 -0.586684 0.411451 -0.408173 -0.819774 0.749396
254 -0.433925 -1.505937 -0.246596 -0.384951 -0.203506 0.179078 0.030473 1.051650 -0.106195 -0.562904 -0.338813 -0.117970 -0.713346 0.067800 -0.130952 -0.378065 -0.084724 0.425168 0.264667 -0.049268 0.048024 -0.436153 0.092095 0.432583 -1.327523 0.305695 -0.731424 -0.181093 0.639296 -0.382919 -0.212424 -0.190407 -0.217723 -0.114165 1.217788 -0.099765 0.467856 -0.532975 -0.408210 -1.147921 -0.420340 -0.293084 -0.266968 -0.376868 -0.285626 0.203131 -0.433762 -0.055951 -0.014556 -0.687535 -0.396663 1.187956 0.776133 0.139860 -0.717193 0.749396
255 0.120188 0.364555 0.411867 0.627276 -0.203506 0.814910 0.081333 1.248646 -0.106195 -0.177916 0.572598 0.060987 0.173000 -0.171325 0.281324 3.356559 -0.742370 0.526551 -0.789338 1.238521 1.755026 -0.388512 -0.225038 1.669407 0.903949 0.354151 0.061615 -0.068622 -0.430749 0.181907 -1.060590 1.348506 -0.217723 0.762087 0.834630 0.593342 -1.003421 1.230530 -0.094180 -1.484832 0.982532 0.003496 0.899904 0.003829 -0.479250 0.190178 -0.439920 -0.055196 -0.041053 -0.506588 -0.017764 -0.606691 -0.562649 -0.002813 0.724563 0.749396
256 -0.433925 -0.039527 -0.246596 -0.357740 -0.203506 -0.522987 -0.084117 0.683223 -0.106195 -0.089354 -0.440081 -0.169841 2.214927 -0.211069 -0.130952 -0.187046 1.382428 -0.300864 0.217364 -0.889130 -0.166308 0.226327 -1.077673 -0.062147 -1.327523 -0.869178 1.832051 -0.181093 0.006997 -0.212173 0.966617 1.390582 -0.093493 -0.114165 -0.932789 -0.198867 1.147219 -0.250877 0.018871 1.102205 -0.420340 -0.304459 -0.121337 0.027971 0.059121 -0.747970 0.016136 -0.065843 -0.010140 -1.241072 0.891468 -0.651005 0.461367 -0.571128 -0.816439 0.749396
257 -0.459499 -0.632986 -0.246596 -0.249498 -0.203506 0.142650 -0.084117 0.573447 -0.106195 -0.105048 -0.338813 -0.399235 0.004090 -0.263720 -0.130952 -0.300988 -0.920906 -0.650204 0.122388 -0.777149 -0.166308 -0.185771 0.016697 -1.481806 0.838993 0.718834 -0.927135 -0.181093 -0.528026 -0.623536 -1.481022 -0.331855 -0.217917 -0.114165 0.182865 -0.698026 -0.422413 -0.238872 -0.427051 1.102205 0.009418 -0.304459 -0.307528 -0.379963 -0.500253 0.835964 -0.480085 -0.055874 -0.169123 -0.148260 -0.396663 2.363818 0.930723 -1.012506 1.301265 -1.334408
258 -0.450975 1.392163 1.901513 -0.436265 -0.203506 -1.185313 0.064175 -0.238598 -0.073423 -0.663052 0.775134 -0.712250 -1.048208 -0.234881 0.223391 -0.327096 -0.577526 -0.226091 -0.165877 -0.665167 -0.166308 -0.633997 -0.404613 2.325461 0.968494 -1.076377 -0.684810 -0.181093 0.785211 0.412803 1.151313 -1.091016 -0.217917 -0.114165 -0.146968 0.086279 -1.061217 -0.837864 -0.427051 1.102205 0.514292 1.985502 0.484843 -0.379963 -0.178398 0.078229 -0.480085 -0.051954 -0.239782 -0.479847 1.173897 -0.705485 -0.371182 1.477211 1.301265 -1.334408
259 -0.007684 -0.528629 -0.246596 -0.367674 -0.203506 0.218817 -0.084117 -0.397999 -0.106195 -0.763479 -0.541349 -0.278526 -0.623864 0.403148 -0.029620 -0.390791 0.129671 0.876783 -0.084362 -0.441204 -0.166308 1.348283 0.385400 0.991842 -1.327523 0.480526 0.496896 -0.181093 2.560513 -0.140990 0.014078 0.407611 -0.126492 -0.114165 0.576886 0.279012 1.713017 -0.091463 -0.043935 -1.173295 -0.420340 -0.299282 0.010446 0.072540 -0.370646 1.983206 0.066290 -0.051978 -0.239782 -0.088539 -0.396663 -0.781193 0.699397 0.659008 -0.813714 0.749396
260 0.034940 -0.083015 -0.246596 -0.358085 -0.203506 -0.562727 -0.064508 -0.673192 -0.106195 -0.532242 1.484010 0.004284 -0.438252 -0.008779 -0.130952 -0.041485 0.240892 -0.459331 -0.062749 0.034718 0.101607 0.259941 0.193279 -1.739925 0.445723 0.451840 -0.117748 -0.181093 -0.625302 -0.986462 -0.832871 0.809573 -0.059524 -0.114165 0.281123 -0.698026 -1.069329 -0.027170 -0.043935 -0.147908 -0.159283 -0.157612 4.545558 -0.085310 -0.396783 -0.165097 0.001476 -0.056305 -0.071966 -1.174220 -0.396663 -0.716728 -0.573452 -1.166369 -0.821632 0.749396
261 0.213961 -0.466515 -0.006050 1.526424 -0.203506 0.914259 -0.024065 -1.047634 -0.106195 -0.109231 -0.237545 -0.426440 -0.079748 0.413800 -0.130952 -0.437169 -0.400186 -0.044583 0.210201 0.118704 -0.166308 0.015911 0.122600 -0.201962 -1.327523 -0.298556 -0.471744 -0.181093 0.858169 -0.041744 0.043930 -0.381093 0.014238 -0.114165 0.491219 -0.225011 0.167720 -0.011531 0.194728 -0.744779 0.111625 -0.064433 -0.191713 0.088635 -0.373253 -0.104034 0.076572 -0.052477 0.078184 1.556028 -0.396663 -0.587182 0.920293 -0.437407 -0.820518 0.749396
262 -0.144081 -1.594110 -0.246596 -0.383742 -0.203506 0.304919 -0.084117 -1.022070 -0.106195 0.232244 -0.743885 -0.161060 0.364859 -0.007181 -0.130952 -0.437169 -0.216916 -0.970408 0.125969 -0.441204 -0.166308 -0.106104 -0.656708 1.292982 -1.327523 0.721160 1.076169 -0.181093 1.271595 0.368581 -0.287305 -0.869893 -0.189577 -0.114165 -0.079323 -0.103413 0.114993 -0.378637 -0.420771 -1.370101 -0.420340 -0.304459 -0.285893 -0.150307 -0.550724 0.360414 4.791151 -0.059503 0.510970 -1.107368 -0.392359 -0.655797 -0.525771 1.187109 -0.830611 0.749396
263 -0.161131 -1.050255 0.028839 0.030522 -0.203506 -0.152085 -0.084117 0.399759 -0.106195 0.593492 0.977670 -0.189996 0.709049 -0.171664 -0.130952 -0.170482 -1.722010 0.059179 0.510939 0.818590 -0.105070 0.121516 0.120182 0.765987 -1.327523 -0.316291 -0.636517 -0.181093 1.368872 -0.583740 0.019447 -0.043588 0.785821 2.499376 2.114953 -0.272434 0.717294 0.306185 0.559002 1.102205 -0.420340 -0.188478 -0.133806 1.294485 0.205217 0.806358 2.076007 -0.059561 0.003109 0.457422 1.101831 -0.621110 0.322050 -0.529165 -0.828131 0.749396
264 0.631677 -0.789028 -0.246596 -0.337396 -0.203506 -0.201759 -0.084117 -1.739376 -0.106195 -0.103123 -0.035009 0.576523 -0.789196 -0.222943 0.351111 -0.376622 -0.816430 -0.677706 -0.472049 -0.693162 -0.166308 -1.065285 0.393919 -0.288002 -1.327523 -0.283631 0.870854 -0.181093 -0.017322 -0.300787 0.593934 -0.544921 -0.217723 -0.114165 1.016087 -0.698026 0.033875 0.041154 -0.307720 -1.420274 -0.194917 -0.185221 -0.161059 -0.164544 0.001216 0.230887 -0.405556 -0.055845 0.003109 -0.585028 -0.396663 -0.524037 0.478502 -0.261933 -0.255831 0.749396
265 -0.101457 -0.480468 0.140559 0.215949 -0.000276 0.934128 -0.084117 1.126840 -0.106195 0.117478 -0.237545 -0.254565 0.046423 0.398045 -0.130952 0.118901 0.069149 0.986341 -0.272586 -0.217240 0.101607 0.462946 0.182919 0.206728 0.684859 2.670075 -0.574213 -0.181093 0.104274 0.798814 7.779647 0.000278 -0.217917 -0.114165 1.168906 0.085063 -0.414302 0.440752 -0.427051 1.102205 -0.273034 -0.304459 0.050168 -0.379963 -0.431827 0.634272 -0.480085 -0.065843 -0.239782 -0.107257 0.009590 -0.430345 -0.531359 1.848091 1.301265 -1.334408
266 -0.621471 -0.498351 -0.246596 -0.400242 -0.203506 0.553291 0.041502 -0.569431 -0.096327 -0.269511 1.078938 -0.340392 -0.963119 -0.272048 -0.130952 0.300802 -1.028505 -0.413991 -0.510089 -0.805144 -0.166308 0.099019 -0.407951 0.131443 -1.327523 -0.395953 0.523628 -0.181093 -0.138918 -0.270649 0.873912 -0.872578 -0.217917 -0.114165 -0.965130 -0.256626 0.733518 -0.563043 -0.427051 1.102205 0.467114 -0.053165 -0.351537 -0.379963 -0.023626 -0.725765 -0.480085 -0.051146 -0.014556 -0.816783 0.378973 0.005370 -0.777212 -0.084780 1.301265 -1.334408
267 -0.689669 -0.127728 -0.246596 -0.489003 -0.203506 -0.201759 0.039664 -0.548378 -0.106195 -0.434442 -2.566707 -0.581083 -0.581812 -0.191442 -0.130952 -0.437169 1.664777 -0.342488 -0.287901 0.398658 -0.166308 1.146336 -0.075163 -0.417062 -1.327523 0.022032 -0.006251 -0.181093 1.563425 -0.488624 0.663374 -0.872578 -0.217917 -0.114165 0.886474 -0.698026 -0.688075 -0.524698 -0.427051 -1.025767 0.195822 -0.304459 -0.349862 -0.379963 -0.283402 1.091318 -0.480085 -0.055845 -0.239782 -0.644750 -0.396663 -0.534608 -0.147306 -0.099957 1.301265 -1.334408
268 -0.331627 -0.280245 -0.246596 0.062873 -0.203506 0.202259 -0.072474 -0.415293 -0.106195 0.186854 -0.237545 -0.152863 -0.761509 0.147713 -0.130952 -0.306958 -0.727174 -0.483264 -0.174769 -0.469199 -0.030437 0.088696 0.004495 0.454093 0.968643 -0.493835 0.052285 -0.181093 0.006997 -0.428518 -0.292316 0.996678 1.945622 -0.114165 -0.435078 0.409121 -0.254094 -0.394094 2.556231 1.102205 -0.420340 -0.123665 -0.234934 2.718229 6.893464 0.915531 1.803257 -0.057974 0.312242 -0.705808 -0.167204 -0.687277 -0.583509 -0.327535 -0.826378 0.749396
269 -0.544747 -0.933298 -0.246596 -0.444817 -0.203506 -0.572662 0.009025 0.199004 -0.106195 -0.445798 -2.870511 -0.341694 -0.010359 -0.219814 -0.130952 -0.186783 -0.013030 -0.311122 0.823163 -0.637172 -0.166308 0.497089 0.111319 0.292768 0.699768 -0.577858 0.391363 -0.181093 0.371785 -0.001419 0.714774 0.095174 -0.217917 -0.114165 -1.156463 -0.391599 -0.491364 -0.492774 -0.427051 -0.890105 0.579711 -0.304459 -0.402693 -0.379963 0.393311 -0.344585 -0.480085 -0.054190 -0.239782 -0.507480 -0.396663 -0.623147 0.306777 0.312746 1.301265 -1.334408
270 -0.340152 -0.879564 -0.246596 0.102568 -0.055080 1.040100 -0.084117 -1.056657 -0.106195 -0.185315 -0.338813 -0.293728 -0.525425 -0.063212 -0.130952 -0.319290 -0.133743 0.716829 -0.306427 -0.497194 -0.032350 -0.399364 -0.569453 2.024321 0.930032 0.562611 -0.571567 -0.181093 0.201550 0.494554 1.556639 0.511459 -0.217917 -0.114165 0.713658 -0.327153 -0.189199 0.004178 -0.427051 1.102205 -0.420340 0.030647 0.020795 -0.379963 2.157819 0.436280 -0.480085 -0.065843 -0.239782 -0.078734 -0.396663 0.962311 -0.598410 0.429472 1.301265 -1.334408
271 -0.229329 1.640664 -0.246596 -0.346510 -0.203506 0.473812 -0.084117 0.039603 -0.106195 -0.414208 0.370063 -0.298846 2.169214 -0.012176 -0.130952 0.212377 -0.234348 -1.250474 -0.407085 -0.021273 -0.166308 -0.079107 -0.386771 0.981087 0.761329 0.150150 -0.375762 -0.181093 -0.187557 0.235258 0.134273 0.689611 -0.168225 -0.114165 -1.400381 -0.698026 -0.189199 -0.009731 -0.244914 1.102205 -0.420340 -0.304459 -0.438719 -0.189924 -0.499950 -2.200526 -0.103253 -0.042822 -0.085215 -0.774889 -0.396663 -0.446793 -0.589842 0.021735 -0.815897 0.749396
272 0.273635 0.492053 0.646711 1.404792 -0.169254 1.016919 -0.084117 -0.504768 -0.106195 -0.211656 0.876402 -0.443958 -0.533114 0.292829 -0.130952 -0.435988 -1.009830 -0.200968 -0.473655 -0.301227 -0.166308 -1.140188 -0.148719 -0.868771 0.855327 -1.372833 -0.470978 0.156819 0.371785 0.549578 -1.040975 -0.713226 -0.217723 -0.114165 1.754013 -0.698026 -1.389745 -0.425432 -0.420771 1.102205 -0.420340 -0.211931 -0.417823 -0.268540 0.676102 -0.907103 0.030616 -0.065843 0.515387 0.186893 -0.396663 -0.252123 0.275487 -0.078346 -0.774863 0.749396
273 0.077564 0.515250 -0.246596 0.032768 0.468979 1.397756 0.051920 0.331337 -0.106195 0.231906 0.471331 0.577516 0.113824 -0.005497 -0.130952 -0.149262 -0.418766 -0.445506 0.475616 0.958567 -0.013214 0.036820 1.456747 0.862782 -1.327523 0.267802 1.515897 -0.181093 0.760892 0.327959 0.228625 2.030677 -0.217917 -0.114165 1.011149 0.065608 0.153524 0.375159 -0.427051 1.102205 0.188433 -0.304459 -0.330839 -0.379963 -0.249313 0.223485 -0.480085 -0.041840 -0.239782 0.188675 -0.396663 0.095449 0.743353 1.468609 1.301265 -1.334408
274 -0.416875 -0.831418 0.515559 -0.417368 -0.203506 -0.132215 -0.084117 0.996010 -0.106195 0.317686 0.572598 0.248571 -0.824713 -0.166710 -0.130952 -0.239786 -0.037717 -0.315581 0.394596 -0.189245 -0.047660 1.197153 0.110858 -0.304134 -1.327523 0.662722 2.476114 -0.181093 -0.929292 -0.505462 -0.236048 -0.020760 -0.111933 -0.114165 -0.634063 -0.698026 1.992874 -0.189481 -0.081619 1.102205 -0.420340 -0.304459 -0.278993 0.005686 -0.342032 -0.524073 0.081270 -0.055235 -0.230949 1.088953 -0.396663 -0.612714 0.690830 -1.061112 -0.817853 0.749396
275 1.245463 0.797442 -0.246596 -0.275716 -0.203506 0.457254 -0.084117 0.941874 -0.106195 -0.380047 0.775134 -0.287197 0.801855 -0.138586 -0.130952 -0.222665 -0.428589 0.092477 0.723741 -0.049268 -0.166308 1.175450 0.474958 0.733722 0.598303 0.344848 -0.614876 -0.181093 0.493381 -0.525561 0.266781 -0.258445 -0.217723 -0.114165 -0.488651 -0.056598 -0.059410 0.041023 -0.370526 1.102205 -0.162516 -0.192757 -0.301762 -0.379963 -0.564469 0.312304 -0.473894 -0.060244 -0.239782 -0.667925 -0.396663 0.229856 1.505870 -0.551056 -0.544182 0.749396
276 -0.416875 -0.309428 -0.246596 -0.407455 -0.203506 -0.519676 -0.084117 -0.928835 -0.106195 -0.402260 -1.351492 -0.270539 1.181472 -0.321829 -0.130952 -0.056704 0.594247 -0.367165 0.979029 -1.924960 -0.166308 0.360252 -0.322423 0.292768 -0.018727 1.147770 -0.956607 -0.181093 0.785211 -0.472062 0.729377 0.299288 -0.174048 -0.114165 -0.671342 -0.698026 0.435409 0.367824 -0.169547 1.102205 -0.420340 -0.304459 -0.373222 -0.249350 -0.520360 -0.392695 -0.303519 -0.065843 -0.239782 -0.110823 0.180087 -0.444652 -0.520184 -0.371875 -0.806205 0.749396
277 -0.169656 -0.556447 -0.246596 -0.231314 0.321124 0.450631 0.119325 -0.602514 0.025981 -0.383035 1.382742 -0.224691 -0.605049 -0.145411 0.435608 -0.054867 -0.775838 -1.250474 1.022997 -0.077263 -0.166308 0.199860 1.415768 -0.105167 0.728506 0.046260 0.559872 -0.181093 0.128593 -0.031875 0.673468 0.260792 -0.217917 -0.114165 -1.400381 1.251185 -0.813808 -0.247871 -0.427051 -0.270305 2.988330 -0.304459 -0.258836 -0.379963 -0.220797 -2.200526 -0.480085 -0.051296 -0.239782 -0.943357 -0.396663 0.094420 1.331538 0.341980 1.301265 -1.334408
278 -0.561797 -0.486427 -0.246596 -0.345689 -0.203506 -0.244811 -0.084117 -1.739376 -0.106195 -0.192479 -0.237545 0.180351 1.868911 -0.047704 -0.130952 1.397066 -0.167211 -0.523699 1.243456 -0.581181 -0.166308 0.005059 -1.087342 -0.363287 0.644077 1.432306 3.722018 -0.181093 0.225869 -0.508639 -0.009117 -0.583417 0.035590 -0.114165 -0.829840 -0.284594 -0.028991 -0.045453 0.069116 -1.293957 -0.217082 -0.190098 -0.295651 -0.129260 -0.187594 -0.574033 -0.074596 -0.056193 -0.239782 0.210960 -0.200869 -0.214299 -0.715749 -0.280676 -0.823728 0.749396
279 -0.450975 0.067449 -0.246596 -0.132834 -0.203506 -0.807787 -0.084117 -1.739376 -0.106195 0.030756 0.572598 0.242239 0.584951 -0.197541 -0.130952 -0.065166 -0.028628 -0.178521 -0.084362 0.258681 -0.020868 -0.640217 -0.023362 0.507868 1.934562 0.154996 1.086446 0.671757 -0.211876 0.105154 -0.660203 0.293916 -0.217917 -0.114165 0.147067 0.384194 0.595617 0.055399 -0.427051 1.102205 -0.420340 4.765445 -0.439409 -0.379963 -0.236712 0.623169 -0.480085 -0.065843 -0.239782 -0.241853 0.287619 -0.518257 0.251274 0.444229 1.301265 -1.334408
280 -0.374251 -0.848106 -0.246596 -0.462483 -0.203506 -0.413704 -0.058380 1.196014 -0.106195 -0.419107 -1.047688 -0.474648 -0.936841 -0.321829 -0.130952 -0.288525 -0.406600 -0.413099 0.078666 -0.609176 -0.166308 -1.196299 -0.163799 0.733722 -1.327523 -0.296811 -1.048868 -0.181093 0.590657 -0.449189 -1.578595 -0.075817 -0.217917 -0.114165 5.093074 -0.049910 -0.836115 -0.245075 -0.427051 1.102205 -0.420340 -0.057954 -0.270122 -0.379963 0.163487 -0.622143 -0.480085 -0.065843 -0.239782 -0.655446 -0.396663 -0.362295 -0.418117 -1.029081 1.301265 -1.334408
281 -0.399826 -0.715124 -0.246596 -0.416655 0.376498 -0.218318 0.020055 -1.739376 -0.060567 0.097191 -0.237545 0.233722 -0.502047 -0.108185 -0.130952 -0.278817 -0.014048 -0.637420 0.317281 0.454649 -0.166308 0.157247 0.341313 -0.954811 0.642094 0.279626 0.682773 -0.181093 0.031316 0.118836 0.325912 -0.816178 -0.217917 -0.114165 3.923106 -0.265138 0.378627 -0.432616 -0.427051 -1.071577 0.919731 -0.078696 -0.064809 -0.379963 -0.060025 -0.298325 -0.480085 -0.052602 -0.239782 -0.477173 -0.044384 -0.485893 -0.896413 -0.427476 1.301265 -1.334408
282 1.935973 1.415996 -0.246596 -0.081218 -0.203506 0.020119 -0.084117 -0.513791 -0.106195 -0.501460 0.572598 2.055704 -0.597782 -0.183751 -0.130952 -0.437169 -1.005428 1.065277 -0.175263 -0.385213 -0.166308 -1.073490 -0.128574 -0.244982 0.639408 -1.111266 -1.065028 0.066809 0.858169 4.678707 -1.197966 -0.367664 -0.217917 -0.114165 1.207172 -0.312561 -0.704298 1.055557 -0.427051 1.102205 0.109162 -0.284527 0.205951 -0.379963 1.006565 -0.405648 -0.480085 -0.065843 0.109097 0.678034 -0.396663 -0.508405 -0.334304 -0.132758 1.301265 -1.334408
283 -0.459499 0.961046 -0.246596 -0.434062 -0.203506 0.993738 -0.084117 -0.001000 -0.106195 -0.460134 0.471331 0.608912 -0.518637 -0.073020 0.061284 0.202078 0.099232 -0.020501 -0.225160 -0.077263 -0.166308 -0.682830 -0.478860 1.217697 0.937586 -0.533278 -0.306164 -0.181093 0.201550 0.086813 0.538526 0.621573 -0.217917 -0.114165 1.586875 -0.698026 -0.832059 -0.105941 -0.427051 1.102205 -0.175600 1.084280 0.647723 -0.379963 0.011164 -0.178050 -0.480085 -0.051741 -0.186787 -0.628705 1.517553 -0.140445 -0.717611 0.052088 1.301265 -1.334408
284 -0.527698 -0.252290 -0.246596 -0.383224 -0.203506 -0.407080 -0.084117 -1.449145 -0.106195 -0.346030 0.572598 -0.276077 -0.562209 -0.155493 -0.130952 -0.437169 -0.060700 -0.421275 -0.865418 -0.161250 -0.166308 -0.475854 0.399099 0.163708 2.557878 0.027265 -0.474860 -0.181093 -0.211876 -0.512980 0.972201 -0.742769 -0.093493 -0.114165 -0.736519 0.472960 -0.462972 -0.024845 -0.094180 -0.801838 -0.420340 0.148829 -0.265391 -0.082215 -0.310195 0.005139 0.015413 -0.054064 -0.239782 -0.267703 -0.396663 -0.668705 -0.205417 -0.055267 -0.820463 0.749396
285 0.239535 -1.315615 -0.035161 0.793439 -0.203506 1.487170 -0.084117 -0.246117 0.038067 -0.151411 -0.845152 -0.341120 2.648425 0.086004 0.023115 -0.268911 0.704143 7.408249 -0.011123 0.622622 -0.166308 0.955241 -0.342337 -0.051392 -0.355013 -0.519032 -0.682983 -0.181093 -0.138918 0.033971 -0.080704 2.536486 -0.217723 -0.114165 -0.433843 -0.698026 1.660290 -0.225163 -0.244914 -1.503822 -0.420340 -0.298525 -0.366322 -0.194257 -0.461602 0.802657 -0.443317 -0.060950 0.466809 -0.136672 -0.396663 -0.520535 0.665127 0.477380 0.032520 0.749396
286 -0.442450 -0.941854 -0.159486 -0.206996 -0.203506 0.159208 -0.084117 1.335866 -0.069711 0.981177 -0.136277 -0.079690 -0.264610 -0.133072 -0.130952 -0.219910 0.585975 -0.616757 0.365448 0.118704 -0.154826 -0.007645 -0.167482 0.561643 0.702778 -0.312511 0.725451 -0.181093 2.341640 -0.361761 -0.692632 0.721840 -0.217917 -0.114165 0.609721 -0.698026 0.772049 -0.300222 -0.427051 1.102205 -0.420340 -0.304459 -0.185454 -0.379963 -0.406203 2.706709 -0.480085 -0.065843 0.020774 0.303661 -0.396663 -0.637150 2.148069 -0.141780 1.301265 -1.334408
287 0.009365 -1.594110 -0.246596 -0.030726 -0.203506 -1.380698 -0.076763 -1.739376 -0.068081 -0.949642 -0.338813 -0.628365 1.347227 -0.113894 -0.130952 12.571715 -0.098925 1.600882 0.751036 -0.273231 -0.166308 0.286408 -0.676968 0.454093 0.756088 -1.637308 -0.514717 -0.181093 -1.184644 0.898906 0.091177 0.176640 -0.217917 -0.114165 -1.292741 -0.698026 0.380655 -0.340579 -0.427051 0.164137 0.148259 1.160697 -0.503674 -0.379963 -0.417380 -1.878558 -0.480085 -0.057722 -0.239782 -0.741908 0.087888 0.894414 -1.502479 1.973839 1.301265 -1.334408
288 0.120188 0.713588 -0.045440 -0.458422 -0.203506 0.136026 0.011476 -0.426571 0.061651 -0.799669 -1.148956 -0.545097 2.648425 -0.243332 0.216107 -0.323160 -1.679666 -0.549862 0.082618 -0.721158 -0.135689 -0.676213 -0.066874 -1.707660 -1.327523 0.018930 -0.472388 0.630163 -1.962858 -0.787335 -1.513164 -0.228902 -0.217917 -0.114165 0.789204 1.316848 -1.245761 -0.390274 -0.427051 1.102205 1.351337 -0.169550 0.418508 -0.379963 -0.170111 -0.095707 -0.480085 -0.065843 -0.173539 -0.518176 -0.396663 -0.717826 -0.780564 -1.146157 1.301265 -1.334408
289 -0.391301 -0.414016 -0.246596 -0.118062 -0.203506 0.437384 -0.084117 -0.785976 -0.061020 -0.155196 -0.237545 -0.281659 -0.902648 0.385721 -0.130952 -0.101441 0.230430 1.146443 -0.012482 -0.581181 -0.131862 0.349665 -0.273731 1.422042 0.751878 1.185760 0.824670 -0.181093 0.371785 -0.200567 -0.054073 0.286754 -0.217723 -0.114165 0.043624 -0.108277 0.666595 -0.350583 -0.282598 -1.316602 -0.420340 -0.298877 -0.227542 -0.189924 -0.313322 2.162694 -0.428603 -0.063999 -0.204452 0.058536 -0.396663 -0.687866 -0.343244 -0.770381 -0.140490 0.749396
290 -0.092932 -0.250816 -0.246596 -0.119876 -0.203506 -0.791229 -0.084117 0.565928 -0.058576 -0.220821 0.977670 0.000379 2.648425 -0.099157 -0.130952 0.056452 0.751150 -1.250474 0.846630 -0.245236 -0.166308 0.054024 -0.020830 -0.944056 0.660647 0.470253 0.661831 -0.181093 -0.917133 -0.425595 -0.272558 -0.178769 0.142349 -0.114165 -0.231154 -0.623851 0.334012 -0.173331 0.056555 -0.790874 -0.420340 -0.304459 -0.202407 -0.073548 -0.653942 -2.200526 0.121262 -0.046878 -0.239782 -0.416561 -0.396663 -0.210558 -0.684458 1.821445 -0.826487 0.749396
291 1.232676 0.447686 3.727518 0.454634 -0.203506 -0.898857 0.071529 -1.739376 0.050334 -0.057405 0.167527 -0.342940 1.154616 -0.176669 0.431203 1.434850 -0.790726 2.049598 0.188834 3.254190 -0.166308 4.935814 -0.057435 -1.196798 1.311744 -1.054281 1.185611 -0.181093 -1.148165 0.295174 -0.216576 1.322544 -0.032349 -0.114165 -0.624805 0.179910 -0.179059 -0.550905 -0.119302 -0.180465 0.807828 0.138018 -0.344194 -0.139783 -0.604883 -0.595313 0.059887 -0.040282 -0.164706 -0.843078 -0.396663 -0.663269 0.173234 -0.310680 -0.825024 0.749396
292 -0.587372 -0.712942 -0.246596 -0.385297 -0.203506 -0.373964 -0.084117 -0.996506 -0.106195 -0.631022 -2.060368 -0.334038 -0.370035 -0.054508 -0.130952 -0.344676 1.165359 -0.093639 -0.582588 -1.001112 -0.166308 2.317256 -0.304466 0.583153 -1.327523 0.782021 -0.513992 -0.181093 -0.868494 -0.547164 -0.041617 -0.342598 -0.217917 -0.114165 1.071388 -0.497389 -0.692131 -0.126354 -0.427051 0.456857 -0.420340 -0.304459 -0.450498 -0.379963 0.972089 -0.179900 -0.480085 -0.065843 -0.217701 -0.052884 -0.396663 2.163219 -0.273958 -0.911096 1.301265 -1.334408
293 -0.450975 0.856515 -0.246596 -0.354716 -0.203506 -0.281239 0.235753 -0.010022 -0.009870 0.133590 -0.743885 -0.063517 0.108726 -0.167860 -0.130952 -0.099276 -0.539324 -0.119059 -0.799218 -0.469199 -0.166308 -0.842164 0.100498 1.045617 0.726057 -0.471254 0.155863 0.157651 1.198637 0.214693 0.627437 0.170373 -0.217917 -0.114165 -0.484947 -0.633579 -0.637376 -0.417327 -0.427051 -0.526455 -0.151741 -0.304459 -0.379284 -0.379963 0.064198 -0.346435 -0.480085 -0.062097 -0.239782 -0.707145 -0.396663 1.242062 1.288327 -0.001834 1.301265 -1.334408
294 1.168740 -0.389457 -0.246596 0.333909 -0.203506 1.119579 -0.084117 0.887738 -0.106195 0.419835 2.192885 0.167091 2.648425 -0.001031 -0.130952 -0.254742 0.268987 -0.191008 0.324074 0.258681 0.266183 0.020675 0.592832 -1.083871 0.799984 0.343103 -0.397806 -0.181093 -0.625302 -0.266858 4.840704 1.080830 -0.217723 -0.114165 0.141636 0.153158 0.636176 -0.077722 -0.420771 1.102205 -0.420340 -0.134230 11.294164 -0.303824 -0.215470 0.575059 -0.100098 -0.049467 0.731780 0.507784 -0.331755 -0.392357 2.114171 -0.252981 -0.774863 0.749396
295 2.132044 -0.820638 -0.171866 -0.222071 1.052409 0.958966 -0.084117 0.860670 0.084374 -0.610547 -0.440081 -0.715604 -0.947346 1.543847 -0.130952 -0.283080 -0.229638 0.217645 -0.535655 -0.357217 -0.166308 -0.406642 -1.122912 1.852241 0.215864 -1.189184 -0.841752 0.648797 -0.211876 0.553220 0.317393 -1.668444 -0.217917 0.299621 2.824488 1.561259 -0.848283 -0.016283 -0.427051 -0.851980 0.301876 -0.119615 -0.288209 -0.379963 0.073403 -0.428778 -0.480085 -0.065843 -0.239782 -0.025252 -0.396663 -0.696581 0.011381 -0.518675 1.301265 -1.334408
296 0.576266 1.851696 -0.246596 -0.205355 -0.203506 -0.546169 -0.084117 -1.739376 0.029104 0.310004 1.686546 0.286542 0.234288 0.216127 0.003611 -0.293248 -0.840431 3.775857 -0.102147 1.070548 -0.166308 -0.821652 -0.536071 -0.766599 -1.327523 -0.787868 -0.667723 -0.181093 -0.880654 -0.561735 -0.755020 -0.823340 0.251050 -0.114165 1.310739 0.148294 -0.012768 0.053479 0.056555 -1.501578 -0.420340 -0.255422 -0.322116 0.595612 -0.602321 0.035670 1.606591 -0.011976 -0.239782 0.413299 -0.396663 -0.713439 -0.434135 -1.043138 -0.827886 0.749396
297 0.171337 -0.827757 -0.246596 0.535923 -0.203506 -0.522987 -0.084117 0.126822 -0.106195 -0.301378 0.876402 -0.244438 0.298196 -0.152361 -0.130952 0.069899 -0.338137 0.691707 -0.190702 1.168532 -0.166308 0.667540 -0.572561 -1.239818 -0.339711 -0.046001 0.113091 -0.181093 -0.503706 -0.472105 -0.594056 1.090678 -0.216752 -0.114165 -0.389652 -0.372144 0.291425 0.176798 -0.282598 -1.330989 -0.420340 -0.163634 -0.380171 -0.144117 0.741714 -0.799781 -0.380344 -0.059411 0.413814 -0.540460 -0.396663 0.027427 -0.104282 -0.743245 -0.486512 0.749396
298 0.759549 2.084582 -0.246596 0.146884 -0.203506 0.208882 -0.009358 -0.216041 -0.106195 0.880728 1.585278 6.296110 0.170014 0.171411 -0.130952 -0.273962 3.111041 0.635218 0.077678 0.958567 -0.040005 -0.395129 0.543795 1.475817 0.328950 -0.160746 -0.707593 0.189429 6.135435 10.139603 0.477462 0.808678 0.512127 -0.114165 -0.602710 0.343458 -0.722550 0.009490 0.332901 -0.970323 -0.420340 -0.031419 0.441375 0.288578 -0.700328 0.754547 0.668890 -0.048373 1.350046 -0.186589 0.114907 -0.165294 -0.582764 0.954356 -0.827861 0.749396
299 0.222486 0.146050 -0.246596 0.300737 -0.203506 -0.120625 -0.084117 0.447129 -0.106195 -0.135225 0.977670 -0.684715 -0.524354 0.083506 -0.130952 -0.434086 -0.814395 0.334934 -0.228618 -0.553185 -0.028523 -1.209532 -0.181526 -1.126891 -1.327523 -1.467905 -0.798430 0.286760 -1.160325 -0.133874 -0.891787 0.136354 -0.217917 -0.114165 -0.380640 -0.548460 -1.249817 0.226787 -0.427051 1.102205 -0.191300 0.040507 -0.290969 -0.379963 2.340720 -1.125450 -0.480085 -0.065843 -0.217701 -0.345252 -0.396663 -0.294500 0.230414 -0.690232 1.301265 -1.334408
300 -0.024734 -0.973430 -0.167889 -0.432464 -0.203506 -0.562727 -0.084117 -1.739376 -0.063419 -0.309191 0.572598 -0.197288 0.440714 -0.088283 -0.130952 -0.437169 -0.526496 -0.380990 -0.506755 -0.413208 -0.166308 -0.015585 0.711513 -0.513856 -1.327523 0.617366 -0.141189 -0.029523 -0.698260 -0.986462 -1.160025 -0.527912 -0.217723 -0.114165 -0.383973 -0.408623 1.743436 -0.080744 -0.345404 -1.353067 -0.420340 -0.299141 -0.237448 -0.126784 0.455732 -0.716513 -0.346456 -0.057553 -0.182371 0.024665 -0.396663 -0.187065 0.351850 -1.103425 -0.428842 0.749396
301 -0.280478 0.269672 -0.246596 -0.066403 1.009594 0.261868 -0.084117 1.006537 -0.106195 0.602253 1.281474 0.324822 -0.833754 -0.184975 -0.130952 1.168262 -0.653137 0.248119 1.702530 0.118704 -0.043833 0.157247 0.113391 1.465062 -1.327523 0.938632 2.827114 -0.181093 0.201550 1.385434 0.524137 -0.816178 0.446130 -0.114165 -0.445940 0.370818 1.076241 0.543675 1.161939 -1.505915 -0.420340 -0.304459 1.422303 0.500283 -0.225430 -0.488915 0.198833 -0.043785 0.471225 3.878031 0.587232 1.942589 -0.516831 -0.817729 -0.821841 0.749396
302 0.000841 -0.192134 -0.246596 -0.429958 -0.203506 -0.254746 -0.084117 0.779465 -0.106195 0.123915 -0.035009 -0.361816 -0.670055 0.030641 -0.130952 0.025818 0.162121 -0.566214 -0.121908 -0.581181 -0.120380 0.817345 -0.182907 0.787497 0.955064 0.381772 -0.734164 -0.181093 -0.576664 2.131787 0.889733 0.833745 -0.217917 -0.114165 -0.366198 0.289347 0.603729 0.038271 -0.427051 1.102205 -0.420340 0.449741 -0.041844 -0.379963 -0.351930 0.600965 -0.480085 -0.049670 -0.239782 1.709342 2.481794 0.525165 0.253509 -0.352712 1.301265 -1.334408
303 0.503805 -0.851213 0.336837 0.768085 -0.203506 -0.201759 -0.061444 0.063663 -0.106195 -0.154061 -0.035009 -0.245894 0.659674 -0.072953 -0.130952 1.973864 -0.699103 1.058291 -0.417954 1.602461 -0.166308 0.767587 -0.727042 -0.954811 0.950444 -1.390859 0.113266 -0.181093 -1.136006 -0.773929 -1.044125 1.196316 -0.189577 -0.114165 0.243844 -0.093685 1.033654 -0.257897 -0.232353 -1.484982 -0.083245 -0.117538 -0.129222 -0.261730 -0.643275 -1.482575 -0.307633 -0.063815 0.056103 1.127282 -0.396663 -0.671545 -0.302268 -0.965228 -0.801776 0.749396
304 0.546429 -0.244217 -0.246596 -0.349231 -0.203506 -0.731620 -0.084117 -0.175439 -0.106195 -0.801980 0.066259 0.865312 -0.921885 -0.240414 -0.130952 -0.292526 0.646675 0.396329 1.387835 2.386332 -0.166308 0.247501 -0.192692 0.206728 0.804231 0.516190 -0.838757 0.286261 -0.917133 0.494067 -0.532706 1.712868 -0.074082 -0.114165 -0.519264 -0.250546 -0.787444 -0.011360 -0.094180 1.102205 -0.420340 -0.074593 -0.346609 -0.176306 -0.170649 0.645374 -0.103896 -0.039793 -0.195620 -0.838176 -0.396663 -0.462933 -0.659500 -0.841578 -0.822083 0.749396
305 -0.109982 -0.857542 -0.246596 -0.292432 -0.203506 0.483747 -0.084117 0.716306 -0.072789 -0.456743 -0.440081 -0.804785 -0.900958 0.164522 -0.130952 -0.349793 -0.165045 -0.360178 -0.340268 0.622622 -0.166308 -0.242411 0.127665 0.045403 1.144539 0.160617 -0.819332 0.060820 1.295914 0.061398 -0.366695 0.966240 -0.217917 -0.114165 0.114726 -0.423215 0.106881 0.120775 -0.427051 -1.223824 -0.420340 0.064770 -0.389190 -0.379963 -0.340575 -0.314979 -0.480085 -0.065843 -0.208868 -0.123302 -0.396663 7.025504 -0.691536 -0.354671 1.301265 -1.334408
306 -0.271953 -0.939533 -0.246596 -0.123116 -0.203506 -0.042801 -0.084117 1.307294 0.012763 1.195229 0.167527 0.369678 -0.476585 -0.183040 -0.130952 -0.437169 -0.041256 -0.550308 0.065080 -0.525190 -0.166308 -0.246646 0.718189 -0.718201 0.802881 -0.451290 -0.661866 -0.181093 -0.090280 -0.283526 -0.474147 0.050412 0.275314 -0.114165 -0.431868 1.213490 0.005484 0.240680 1.274990 -0.852205 -0.420340 -0.304459 -0.257752 0.222962 -0.384486 0.967342 -0.107273 -0.054441 -0.098464 -0.564527 -0.396663 -0.684138 0.148090 2.937932 -0.817124 0.749396
307 5.166878 5.064144 -0.246596 1.573374 -0.203506 12.223463 -0.084117 3.087776 0.155623 0.343894 0.167527 11.556092 2.648425 1.319458 1.062621 -0.198985 3.701336 0.740912 0.264791 1.714442 -0.009386 3.125837 1.268079 -0.244982 -1.327523 -1.101575 -0.150538 -0.181093 -0.771218 1.693654 0.017156 4.557038 -0.149785 -0.114165 -0.266211 0.848697 -0.876674 1.833202 -0.131863 1.102205 -0.036528 -0.119774 1.090678 0.108443 0.395871 0.843366 0.178676 -0.013777 0.064935 1.334969 0.895995 -0.232207 3.772190 0.483464 -0.813039 0.749396
308 -0.382776 1.778794 -0.246596 -0.416180 0.731865 -0.665387 -0.084117 -0.022053 -0.070300 -0.315879 -0.338813 -0.299817 -0.231516 -0.198944 -0.130952 0.149600 -0.349463 0.346232 -0.595926 -0.357217 -0.166308 -0.539641 -0.633916 0.163708 -1.327523 -0.550238 -0.506160 -0.181093 -0.455068 1.890980 0.590283 -0.953150 0.302489 -0.114165 1.003002 0.012105 0.721350 -0.524810 0.232411 1.102205 -0.420340 -0.304459 -0.354494 0.009400 -0.128366 -1.121749 0.183987 -0.054824 -0.023388 -0.247201 -0.396663 -0.736256 1.111388 2.117771 -0.826841 0.749396
309 0.738237 1.079636 -0.246596 0.499403 -0.203506 -0.668699 -0.084117 1.301279 -0.106195 -0.489937 -1.655296 0.036011 1.152095 -0.072048 0.007083 -0.294166 0.757233 0.052192 -0.865418 1.784431 -0.062969 -0.733383 0.491994 -2.288430 -0.453805 -0.174507 -0.326677 -0.181093 -1.512953 -0.162508 -2.154872 0.733030 -0.210929 -0.114165 -0.798733 0.542879 -0.706326 -0.378339 -0.383087 -0.817417 -0.420340 -0.304459 -0.175745 -0.292062 -0.539451 -0.549978 -0.185487 -0.051393 -0.235365 0.925833 0.042188 1.417248 -0.286251 -0.051280 -0.803698 0.749396
310 -0.570322 -0.448519 -0.246596 -0.121042 1.374380 0.006873 -0.084117 0.105769 -0.080303 -0.411690 -2.161636 -0.190404 0.142665 0.161128 -0.130952 0.187581 -0.001196 0.621244 1.782809 -0.889130 -0.135689 0.858370 -0.641744 1.561857 1.051636 1.867056 -0.271452 -0.181093 0.006997 1.190035 0.883147 1.404906 -0.185695 -0.114165 -1.175966 -0.049910 -0.440665 -0.219032 -0.257475 -0.651349 -0.420340 -0.304459 -0.408262 -0.249969 -0.722186 -0.864545 -0.248495 -0.040993 20.167424 0.197589 -0.288891 1.773193 -1.372475 1.741506 -0.808787 0.749396
311 -0.612946 -0.723910 -0.246596 -0.241248 -0.203506 -1.112457 -0.076763 -1.739376 -0.106195 0.713820 -0.338813 0.569926 -0.306099 -0.210221 -0.130952 -0.220422 0.240127 -0.549267 0.794263 0.062714 -0.166308 0.328756 0.005646 0.529378 0.733804 -0.925775 -0.758224 -0.181093 0.128593 0.105154 1.117309 -0.998807 -0.217917 -0.114165 0.158423 0.008457 0.137301 0.080726 -0.427051 1.102205 -0.420340 0.094809 -0.375588 -0.379963 -0.325774 0.626870 -0.480085 -0.055845 -0.239782 0.436475 0.509412 -0.405838 -0.619270 0.589280 1.301265 -1.334408
312 0.103138 -0.073304 -0.246596 -0.240428 -0.203506 1.179189 -0.084117 1.086237 -0.106195 -0.439320 -0.642617 -0.127270 0.102005 -0.040062 -0.130952 -0.290985 1.011332 0.682490 -0.217749 -0.273231 -0.166308 -0.451769 0.061936 1.443552 0.676701 0.291352 0.288088 -0.181093 0.955445 -0.196267 1.007565 -0.424960 -0.106110 -0.114165 2.912624 0.289347 0.134259 0.056349 -0.056496 -0.100238 0.216755 -0.243202 -0.194621 -0.051264 -0.556576 -0.055924 -0.018493 -0.065843 0.042854 1.683047 -0.074591 -0.696235 0.075079 -0.314526 -0.817618 0.749396
313 0.026415 -0.669710 -0.202178 -0.383094 -0.203506 0.228752 0.089912 -1.428092 -0.106195 -0.957925 -0.237545 -0.690407 -0.924110 -0.178580 0.343077 0.510255 -0.888112 -0.312311 -0.671265 -0.049268 -0.166308 -0.901186 0.900066 -2.073330 -1.327523 0.887850 -0.532073 -0.181093 -2.643796 -0.736759 -1.958652 -0.159074 -0.217723 -0.114165 -1.037466 1.639081 -0.237870 -0.114290 -0.219792 1.102205 1.745231 -0.304459 -0.017794 -0.215923 -0.401117 -0.770175 -0.454334 -0.056164 0.078184 -0.663468 -0.396663 -0.249938 -0.694888 -1.029780 0.147861 0.749396
314 -0.604421 -0.651361 -0.246596 -0.305995 -0.203506 -0.619024 -0.084117 -0.471685 -0.106195 -0.662127 -2.161636 -0.251564 -0.184733 -0.154993 -0.130952 -0.325915 -0.844147 -0.353191 -0.521328 0.006723 -0.166308 -0.468973 -0.394253 0.163708 -1.327523 -0.074494 -0.489220 -0.181093 0.614977 -0.970662 0.450473 0.318088 -0.068259 -0.114165 -0.142771 -0.698026 -0.343323 -0.248140 -0.156986 1.102205 0.366524 -0.304459 -0.440148 -0.207876 -0.492202 -0.116987 -0.071165 -0.065843 0.162092 -0.765975 -0.396663 -0.688312 -0.950799 -1.108530 -0.824348 0.749396
315 -0.493599 1.336542 -0.246596 -0.427280 -0.203506 0.672510 -0.084117 0.017046 -0.106195 -0.066485 -0.845152 -0.202495 -0.991257 0.169337 -0.130952 -0.111937 0.252750 0.461291 -0.718568 -0.721158 -0.166308 0.698242 0.170947 3.336430 -1.327523 -0.164816 2.056805 -0.181093 1.514787 -0.212427 0.152957 -1.363168 -0.080876 -0.114165 0.786735 0.528895 0.599673 -0.062786 0.219850 -0.970230 -0.420340 -0.048763 -0.171458 0.165393 -0.621497 2.249663 -0.007569 -0.065843 0.047271 0.009511 -0.396663 -0.699774 -0.159599 -0.018409 -0.811414 0.749396
316 1.935973 -0.235938 0.140334 0.035489 -0.203506 2.815132 -0.084117 2.180993 -0.066542 0.757860 1.180206 -0.212997 -0.141526 -0.124353 -0.130952 -0.039648 -0.387156 -0.150871 -0.020386 -0.217240 0.159017 0.284556 1.311361 -0.642916 0.628023 0.539158 -0.189455 -0.181093 -0.479387 -0.258216 -2.887210 -1.210978 -0.217723 -0.114165 -0.195110 0.265028 0.080518 0.541272 -0.345404 -1.457514 -0.420340 -0.183953 8.768070 -0.167639 0.103582 0.130966 -0.368078 -0.029925 -0.067550 2.311013 -0.396663 4.835234 -0.093666 -1.140912 -0.428842 0.749396
317 0.171337 -0.354252 -0.021431 -0.280683 -0.203506 1.755412 -0.084117 1.214059 -0.106195 -0.768804 0.977670 -0.465382 0.692262 -0.137795 -0.130952 0.121262 -0.311971 -0.675327 -0.729807 0.174695 -0.166308 0.671774 -0.107394 -2.256165 -1.327523 -0.301560 1.665088 -0.057142 -1.306240 -0.986462 -1.843325 -0.645188 -0.217723 -0.114165 0.518129 -0.698026 0.477996 -0.272786 -0.420771 -1.355878 -0.420340 -0.299793 -0.275001 -0.303824 -0.043203 -0.607340 -0.100098 -0.057093 -0.098464 0.564831 -0.396663 -0.097925 1.037632 -1.046775 -0.774863 0.749396
318 -0.212280 -0.670562 0.057801 -0.481142 -0.203506 -0.476624 -0.051640 1.636623 -0.106195 0.761624 -0.237545 -0.181491 -0.121360 -0.183817 -0.130952 0.397493 -0.163850 0.114924 -0.748827 -0.441204 0.185809 -0.397511 0.144241 -2.180880 0.535679 0.775916 -1.134626 0.983216 -1.403517 -0.986462 -1.442794 0.583078 0.182142 -0.114165 -0.642704 -0.698026 -1.034854 0.092420 1.513653 -1.032880 0.163959 0.297277 -0.440592 0.666799 0.064463 -0.061475 0.026641 -0.065843 -0.177955 2.015972 -0.396663 -0.336287 -0.641993 -1.152521 -0.810889 0.749396
319 3.095348 0.692911 -0.246596 8.237573 -0.203506 1.086463 -0.084117 2.180993 -0.052827 1.431439 1.382742 2.731575 2.648425 -0.156350 -0.130952 -0.221812 1.587331 -0.119208 -0.050522 1.238521 -0.028523 -1.007056 1.408516 -0.234227 -1.327523 -0.852703 -0.610403 -0.181093 -1.354878 -0.814635 -1.744176 0.695878 0.346164 -0.114165 -0.053894 -0.044438 -0.702270 0.253975 1.676948 1.102205 -0.420340 -0.143949 -0.172838 0.110300 -0.326892 1.518758 -0.219551 -0.042875 0.303410 0.555917 1.131018 0.514346 0.996284 -1.296523 -0.815886 0.749396
320 -0.783442 1.488934 -0.246596 -0.464340 -0.203506 -1.085964 -0.084117 -0.504768 -0.106195 -0.493987 0.977670 -0.570889 -0.386371 -0.275545 -0.130952 -0.162709 -0.132394 -0.395781 -0.444754 -1.281066 -0.166308 0.455006 -1.506004 0.142198 -1.327523 0.061960 0.738965 -0.181093 -0.600983 -0.657295 1.574465 -0.466140 -0.217723 -0.114165 -0.746888 0.149510 -0.491364 -0.435377 -0.219792 0.197006 -0.081243 -0.304459 -0.302797 -0.124927 0.508232 0.294725 -0.434504 -0.052583 0.056103 0.539873 -0.317823 -0.769738 2.110818 -0.446359 0.147861 0.749396
321 11.338843 7.629372 0.833386 0.634878 -0.203506 4.639838 -0.069410 -0.434090 -0.034856 0.512264 -0.642617 4.163077 2.648425 -0.077653 0.379923 3.300703 1.388909 10.190183 3.161142 3.646125 0.522616 -0.293494 0.678360 -0.427817 -1.327523 -1.361495 0.010541 -0.030355 -0.503706 1.840616 -1.015562 4.768314 -0.217917 0.904072 -0.549383 3.222283 1.666374 2.740571 -0.427051 0.467671 0.114087 0.593101 0.270689 -0.379963 -0.382109 0.882224 -0.480085 0.842809 0.295460 1.011404 1.215947 -0.116116 1.813187 -0.330052 1.301265 -1.334408
322 -0.323102 -0.890454 0.268710 -0.358776 -0.203506 0.611245 -0.084117 0.170432 -0.106195 -0.134985 -0.541349 -0.269723 -0.166267 -0.130953 -0.130952 -0.069036 -0.180241 -0.237240 0.499576 -0.609176 0.067161 -0.590194 -0.500041 -0.105167 -1.327523 -0.796978 0.614505 -0.181093 -0.284834 -0.325460 0.706613 -0.720388 -0.148814 -0.114165 0.040291 -0.698026 -0.982127 -0.262192 -0.370526 -0.703879 -0.420340 -0.298490 2.542109 -0.358297 -0.304903 -0.201180 -0.166029 -0.065843 -0.116128 0.568397 0.053011 -0.651186 0.319815 -1.111713 -0.828528 0.749396
323 -0.629996 -1.594110 -0.246596 -0.459848 -0.203506 -0.205071 -0.032644 -1.739376 -0.090262 0.633102 0.977670 0.996899 0.281522 -0.036164 -0.130952 1.170361 -0.268195 0.099316 0.139184 -0.945121 0.109262 0.568022 -0.351777 -1.212931 -0.308045 0.233786 0.193490 -0.181093 -1.671028 -0.230641 0.070488 -1.928959 0.386927 -0.114165 -0.589872 -0.257842 -0.243954 -0.411502 0.194728 -0.892823 0.207135 -0.173635 -0.417823 0.915026 -0.322086 -0.166947 1.957253 -0.055477 0.113513 -0.082299 -0.267420 -0.600176 -1.730824 -0.243539 -0.827912 0.749396
324 -0.340152 -0.525484 -0.246596 -0.384390 -0.203506 -0.446820 -0.071248 -1.739376 -0.106195 -0.350913 -0.237545 -0.119051 0.052760 0.213809 -0.130952 0.309527 -1.220201 -0.736424 0.802291 -0.441204 -0.166308 2.048611 0.384595 -0.944056 -1.327523 0.845111 0.210335 -0.181093 -0.479387 -0.631923 0.316462 -1.741854 -0.011967 -0.114165 0.443817 0.298467 0.362403 0.420103 0.244972 -1.208176 -0.173137 -0.304459 -0.195064 -0.121213 0.035253 0.140218 -0.206765 -0.046302 -0.058718 -0.157174 -0.396663 -0.475659 0.736648 -0.800594 -0.817912 0.749396
325 0.716925 0.032537 -0.246596 -0.300379 0.229214 -0.181890 -0.084117 0.132837 -0.106195 -0.088360 -1.047688 -0.106982 -0.920843 0.054784 -0.130952 1.428093 -0.297722 3.460262 -0.152785 -0.049268 -0.166308 -0.705857 -0.584188 -2.223900 0.718061 -1.206628 0.634789 -0.181093 -2.157412 0.720747 -1.257526 -0.844826 -0.217917 -0.114165 -0.747135 0.121543 -0.256122 -0.115471 -0.427051 -1.176996 -0.420340 -0.030345 -0.282246 -0.379963 -0.330268 -1.047733 -0.480085 -0.050144 -0.124961 -0.717842 -0.127273 -0.098786 -1.493166 1.197599 1.301265 -1.334408
326 -0.442450 0.024138 -0.246596 -0.142336 -0.203506 -0.373964 -0.084117 -1.739376 -0.106195 -0.239100 0.673866 0.474700 -0.235121 -0.188904 -0.130952 -0.289903 1.358428 0.293013 -0.095478 0.202691 0.067161 -0.920508 -0.057896 0.497113 2.003951 0.126891 0.696341 0.672756 -0.309153 0.174029 -0.705374 0.253631 -0.217917 -0.114165 -1.400381 0.610365 -0.012768 -0.004322 -0.427051 1.102205 -0.420340 4.908665 1.179239 -0.379963 -0.229471 0.236438 -0.480085 -0.065843 0.378485 -0.524416 0.414089 -0.512065 0.103017 3.620036 1.301265 -1.334408
327 -0.468024 -0.059037 -0.246596 -0.369099 -0.203506 -0.658764 -0.084117 -0.043106 -0.067176 -0.702778 -0.845152 -0.545583 -0.823868 -0.210069 -0.130952 -0.253496 -0.156063 -0.086355 0.335930 -0.861135 -0.166308 0.519851 -0.272004 -0.492346 0.533659 -0.003360 1.181850 -0.181093 0.225869 -0.211516 -0.827001 0.357478 -0.140079 -0.114165 -0.731334 -0.698026 1.735325 -0.147554 -0.345404 -0.145426 -0.420340 -0.304459 -0.405601 -0.051264 -0.274739 -0.327931 1.216214 -0.057519 -0.094047 -0.622466 0.235272 -0.444739 0.547415 -0.503079 -0.827549 0.749396
328 0.427081 -0.357619 0.093965 -0.135037 -0.203506 0.495338 -0.037546 2.289266 -0.106195 0.168643 0.268795 0.557637 -0.496498 0.102068 -0.130952 -0.213482 0.041586 -0.539010 0.199826 0.174695 -0.166308 -0.255381 -0.464932 0.077668 0.746792 0.104698 2.594341 -0.181093 -0.138918 -0.053711 -0.561913 -0.336331 -0.136197 -0.114165 1.296667 0.203013 1.810359 0.167501 -0.207230 0.099984 0.119706 -0.304459 -0.246269 -0.044455 -0.616363 -0.669328 0.198527 -0.052685 -0.230949 -0.066255 -0.396663 -0.749552 -0.018792 0.099750 -0.820321 0.749396
329 0.512329 1.405114 0.375103 0.294863 -0.203506 -0.132215 -0.084117 1.319324 -0.106195 -0.642063 1.787814 -0.182638 -0.496329 0.220890 -0.130952 -0.351695 -0.079813 -0.708923 -0.637672 0.174695 -0.020868 1.524556 0.242662 -0.707446 -1.327523 2.330880 -0.529171 -0.181093 -0.187557 -0.410071 0.047795 -1.264692 0.491746 -0.114165 0.773156 -0.180020 0.914005 0.236322 2.191957 -0.817370 -0.420340 -0.272625 -0.193684 1.080923 0.126925 2.096080 0.218614 -0.058158 -0.164706 -0.948705 -0.396663 -0.547987 2.674418 0.877844 -0.816067 0.749396
330 -0.246379 -0.793837 0.003104 0.959818 -0.203506 -0.476624 -0.084117 -0.763419 -0.106195 -0.266447 -0.541349 -0.506530 -0.560598 -0.081205 -0.130952 -0.407387 -0.416423 3.481669 -0.739935 -0.441204 0.113089 0.400748 -0.041780 0.120688 0.662512 -1.142084 -0.842746 -0.181093 0.420423 0.468800 -0.139334 -0.325588 -0.176960 -0.114165 0.245572 0.015753 0.404990 -0.192433 -0.332842 -0.212371 0.012035 0.003179 -0.246368 -0.371916 -0.433144 0.391871 -0.378404 -0.057384 -0.239782 -0.311380 -0.170694 0.503464 -0.489266 0.007817 -0.821804 0.749396
331 0.580528 -0.607852 0.860247 -0.124109 -0.203506 -0.370652 -0.084117 -1.739376 -0.106195 0.666485 0.572598 1.289395 -0.950698 -0.166298 -0.130952 -0.406338 -0.706866 -0.629393 -0.366204 0.790594 -0.166308 -0.144482 0.448827 -2.094840 -1.327523 0.560672 2.253899 -0.181093 -1.087367 -0.986462 -1.380299 -1.316616 -0.111933 -0.114165 0.896103 -0.698026 -0.517727 0.363033 0.056555 -1.139975 -0.420340 -0.074435 1.562463 0.172202 -0.430907 0.782303 0.084112 -0.065843 -0.032221 0.321489 -0.396663 0.393390 0.776133 -1.225676 -0.812086 0.749396
332 -0.638520 1.034966 -0.246596 -0.383008 0.025413 -0.701815 -0.084117 0.286223 -0.039021 0.109213 -2.667975 -0.151374 -0.330941 -0.236389 -0.130952 -0.069233 1.747263 -0.479845 -0.221455 -0.777149 -0.166308 0.588667 -0.287199 0.905802 0.748353 0.257917 0.216797 -0.181093 0.031316 0.225071 1.426422 -0.475093 -0.175019 -0.114165 -0.678749 -0.378224 -0.633320 -0.160133 -0.144425 1.102205 -0.151895 -0.035804 -0.311421 -0.021551 0.518736 0.623169 -0.076123 -0.065843 -0.018972 -0.468260 0.394738 -0.673234 -0.419235 2.707697 -0.803057 0.749396
333 -0.246379 -0.059782 -0.113155 0.166688 -0.203506 -0.347471 -0.084117 0.342615 -0.106195 -0.296922 0.167527 -0.279750 0.495131 -0.072998 -0.130952 0.615637 -0.206786 0.467535 -0.675218 -0.595178 -0.166308 0.587211 0.105333 -0.761221 0.717489 1.490453 -0.736710 -0.181093 -0.029482 -0.035814 0.218101 -0.266055 -0.137168 -0.114165 -0.668256 -0.368496 0.773063 -0.704826 -0.081619 -0.923442 -0.420340 0.015012 -0.326847 0.092968 -0.034041 -0.017066 0.114190 -0.045450 0.206253 -0.392939 -0.119207 -0.250286 0.347753 -0.837102 -0.813310 0.749396
334 -0.271953 0.111590 0.175373 -0.278567 -0.203506 -0.304420 -0.084117 1.120824 -0.106195 0.805116 0.268795 -0.396389 -0.384624 0.116861 -0.130952 -0.248642 0.933402 -0.515225 -0.224419 -0.637172 -0.059142 0.911570 -0.386540 0.335788 0.726678 -0.230911 1.054004 -0.181093 2.365959 -0.707892 0.805976 -0.241436 -0.031378 -0.114165 0.905978 -0.466990 1.380433 0.039197 0.119360 -1.236107 -0.420340 -0.304459 -0.110150 0.155489 -0.438432 0.316005 0.165686 -0.060060 -0.230949 -0.330098 -0.396663 -0.503533 0.308640 -1.169166 -0.819385 0.749396
335 -0.357202 -0.921263 -0.246596 -0.375622 -0.203506 -0.453443 -0.084117 0.418557 -0.084785 0.208401 -0.035009 -0.271267 0.654295 -0.114712 -0.130952 0.821645 0.315970 -0.277377 0.273189 -0.385213 -0.166308 1.500470 -0.445708 0.013138 0.863279 -1.015710 3.646174 -0.181093 0.104274 -0.373833 -0.121079 -0.045379 -0.195400 -0.114165 -0.452112 -0.698026 -0.377798 0.362682 -0.244914 -0.333810 -0.420340 -0.038498 -0.192206 -0.245636 -0.242576 -0.218758 -0.296852 -0.044071 -0.239782 -0.216895 -0.396663 0.341291 0.669224 -0.792271 -0.796489 0.749396
336 -0.544747 -0.485661 -0.246596 -0.390264 -0.203506 -0.678634 -0.030805 0.597507 -0.044905 0.252441 0.066259 -0.098024 0.718146 0.090443 -0.130952 -0.297315 0.503974 -0.380990 1.641641 -0.301227 -0.166308 1.053170 0.659137 -0.137432 0.866556 0.244737 2.347100 -0.181093 1.368872 -0.157001 0.363925 0.147097 0.584918 -0.114165 0.131760 -0.114357 -0.041159 -0.189947 0.910715 -1.019441 -0.323291 -0.304459 -0.389732 0.635848 -0.413150 -0.253916 0.546024 -0.045068 -0.094047 0.269790 -0.396663 -0.639130 -0.321638 0.358975 -0.825081 0.749396
337 -0.135556 0.447931 -0.246596 -0.250924 -0.203506 -0.380587 -0.084117 -1.739376 -0.046897 -0.029241 0.471331 -0.346702 0.105712 -0.210378 -0.130952 0.353542 -0.125956 -0.130060 -0.581600 0.034718 0.009750 -0.049199 -0.207426 -0.169697 -0.351823 0.843948 -0.095703 -0.181093 -1.476474 -0.986462 0.237144 2.622429 -0.217723 -0.114165 -0.197332 -0.169684 1.376378 0.242559 -0.295159 -1.177366 -0.420340 -0.288859 -0.368343 -0.197352 -0.394434 0.254942 -0.424902 -0.057645 -0.041053 1.362602 -0.396663 0.777395 -0.548121 0.534589 -0.198161 0.749396
338 -0.749343 0.247105 -0.246596 -0.395490 -0.154411 -1.009796 -0.003230 0.236599 -0.084106 -0.748183 -2.566707 -0.372053 -0.844598 -0.068578 -0.130952 -0.309057 -0.155909 -0.391098 -0.473408 -1.113093 -0.166308 -0.102134 0.259353 -1.772190 0.456367 -0.008981 -0.900778 -0.181093 -1.549432 -0.145416 -0.849980 -0.490312 -0.210929 -0.114165 -0.393849 0.018793 -0.805696 -0.458617 -0.408210 -0.856323 0.237689 -0.067568 -0.360458 -0.292062 -0.270486 -1.180961 0.125851 -0.050633 -0.129377 -0.528872 0.439865 0.635109 -1.020830 -0.515597 -0.818116 0.749396
339 -0.485074 0.066505 -0.246596 -0.106184 -0.203506 -0.324290 0.012089 -1.062672 -0.106195 0.100106 -0.946420 -0.582561 -0.357360 -0.131879 0.330070 -0.437169 0.002520 0.178846 0.402253 -0.161250 -0.166308 -0.062697 -0.098530 0.765987 0.825389 -0.287895 -0.885827 -0.181093 0.760892 -0.239261 0.822369 -1.223511 -0.100287 -0.114165 0.129785 -0.370928 -0.617096 -0.238579 0.031432 0.111853 0.396078 -0.304459 0.410623 0.118348 -0.437992 -0.055924 0.088491 -0.065843 -0.239782 -0.155391 -0.396663 -0.632980 -0.034437 -0.527976 -0.815043 0.749396
340 0.316259 0.196256 -0.224800 0.036180 -0.203506 -0.946875 -0.084117 0.183966 -0.088904 -0.475840 1.889081 -0.119845 1.844830 -0.193700 -0.130952 0.403823 -0.559455 -0.349921 -0.580488 -0.497194 -0.166308 -0.308051 -0.717142 -1.245196 -0.322855 -1.057285 0.222795 -0.181093 -0.600983 -0.600937 0.070059 -1.224407 0.804261 -0.099966 0.791672 -0.142932 0.032861 -0.225414 0.496196 -1.479199 -0.323291 -0.304459 -0.344687 0.232247 -0.548777 -0.722064 0.548498 -0.042793 -0.239782 -0.256115 -0.396663 -0.590052 -0.342871 -0.556791 -0.828484 0.749396
341 0.256585 -0.760143 -0.203304 -0.317225 -0.203506 -0.811099 -0.084117 -1.739376 -0.066542 -0.616526 0.775134 -0.103033 -0.755538 -0.041104 -0.130952 0.595925 -0.888432 -1.250474 0.005797 -0.245236 -0.166308 -0.169891 1.003897 -0.546121 0.798529 -1.146542 -0.535082 -0.181093 -0.163238 -0.599730 0.930394 -0.314845 -0.162402 -0.114165 -1.400381 -0.119221 -0.752969 0.663241 0.081677 -1.242282 -0.420340 -0.001434 -0.159876 0.201296 -0.284469 -2.200526 -0.075544 -0.053841 -0.222117 -0.944248 -0.396663 -0.616725 -1.183615 -0.884170 -0.791766 0.749396
342 2.021221 -0.492644 0.200958 1.176690 -0.203506 -0.314355 -0.084117 -1.479221 -0.106195 -0.791392 -0.237545 -0.511075 -0.719317 0.420931 -0.130952 0.146386 0.094830 2.264331 -0.019521 1.014557 -0.078279 8.085044 -0.916170 -0.589141 0.866624 -0.950778 0.434632 -0.181093 -0.552345 1.309613 0.223900 -1.002388 -0.197341 -0.114165 0.713411 -0.325937 1.416937 -0.399986 -0.420771 -1.035836 -0.027216 -0.043304 -0.414521 -0.360154 -0.316837 -1.752732 0.385181 -0.033381 -0.239782 0.942769 -0.396663 -0.737412 -1.132954 -0.545391 -0.829912 0.749396
343 0.290684 -0.461481 -0.018730 0.240353 -0.203506 0.659263 -0.084117 -0.152882 -0.074917 -0.071292 0.673866 -0.320535 2.482726 -0.068536 -0.130952 -0.335362 -0.302503 -0.423059 0.115965 1.462484 -0.166308 0.400483 -0.184979 0.486358 0.706322 1.172289 0.776162 -0.181093 -0.260514 -0.176126 -0.228174 0.651116 -0.143961 -0.114165 -0.838727 -0.538125 0.325900 0.776923 -0.119302 1.102205 -0.420340 -0.296236 0.095114 -0.017218 -0.109091 0.378918 0.015103 -0.056392 0.391733 -0.096561 -0.396663 1.280806 0.644639 -0.913544 -0.813809 0.749396
344 -0.382776 -1.215350 0.031503 -0.446156 -0.203506 -0.171955 0.116874 -0.325817 -0.106195 -0.298708 -0.440081 -0.082448 -0.811391 -0.321829 -0.130952 -0.211153 -0.644995 -0.836915 0.379775 -0.357217 0.478602 -0.758130 0.339010 -0.793486 -1.327523 0.324205 0.594839 -0.090917 -0.114599 -0.158441 -0.088435 0.716468 -0.217917 -0.114165 1.888440 -0.353904 -0.891884 -0.393984 -0.427051 1.102205 1.087202 0.042972 -0.131539 -0.379963 -0.431827 0.502894 -0.480085 -0.054548 -0.239782 -0.409875 -0.396663 -0.375151 -0.040770 -0.878435 1.301265 -1.334408
345 -0.331627 -0.297297 -0.246596 -0.321371 -0.203506 -0.734931 -0.084117 0.532844 -0.034404 -0.127372 1.281474 0.067408 -0.476839 -0.016403 0.674000 -0.008424 0.003207 -0.064503 0.082618 -0.161250 1.134994 -0.472149 0.709901 -0.293379 0.686438 0.525784 1.608184 7.363788 -0.309153 -0.267133 0.008387 -1.582949 0.356840 2.401001 0.090038 0.081415 1.917840 0.104192 0.508757 -0.935464 -0.420340 1.554612 -0.352474 0.359765 -0.158666 -0.981119 0.430179 -0.040436 0.431479 0.252854 -0.396663 0.102344 0.315904 -0.393556 -0.825239 0.749396
346 -0.254904 0.980072 -0.246596 -0.394195 -0.203506 0.745366 -0.084117 -0.099498 -0.091212 0.229118 -1.047688 -0.419787 0.106906 -0.321829 -0.130952 -0.152771 -0.149980 0.005068 -0.249614 -0.833139 -0.166308 -0.548905 -0.675011 -0.234227 -1.327523 -0.114906 -0.795280 -0.181093 0.420423 -0.224203 0.138926 -1.781244 -0.087670 -0.114165 2.390471 -0.698026 -1.053106 -0.344905 0.056555 -0.096743 -0.420340 -0.076037 -0.260364 0.006305 -0.292970 -0.327931 -0.036348 -0.065843 0.175340 -0.867591 0.610920 -0.285272 -1.420155 -0.567841 -0.815873 0.749396
347 0.128713 -0.114579 -0.246596 -0.090764 -0.203506 0.136026 -0.084117 -0.862669 -0.106195 -0.766610 0.977670 0.095847 0.476429 0.114691 0.277992 0.711048 -0.540851 0.285283 -0.337427 -0.077263 -0.135689 -0.336371 -0.648190 -0.868771 -1.327523 -1.399775 -0.518263 -0.181093 -0.746898 -0.463654 0.084806 0.629630 0.105469 -0.114165 1.362954 -0.459086 -0.227730 -0.471571 -0.043935 -1.525102 -0.420340 -0.304459 -0.291560 0.174059 -0.677212 -0.623994 0.780760 -0.041888 0.016357 -0.281965 -0.306489 -0.615656 -0.482561 -0.331801 -0.827180 0.749396
348 0.529379 0.005375 -0.246596 0.551731 -0.203506 -0.115657 -0.084117 0.400511 0.020006 0.802100 0.673866 0.738513 1.013266 -0.180975 -0.130952 0.167180 -1.161313 0.021717 -0.062502 0.146700 0.281493 -1.030083 0.124327 0.034648 0.714025 0.406000 -0.716083 -0.181093 1.149999 0.329187 0.477032 1.313592 0.037531 -0.114165 -0.546174 0.914359 -0.971988 0.470658 0.031432 1.102205 0.079916 -0.258996 -0.250902 0.144346 -0.215833 1.330019 0.406490 -0.045508 0.338739 0.350012 -0.396663 -0.663009 0.608879 0.083210 -0.824451 0.749396
349 1.057917 3.011353 -0.246596 0.181308 0.490102 1.227207 -0.084117 1.460680 -0.106195 -0.516352 -0.237545 -0.282872 2.648425 -0.198019 -0.130952 -0.147491 -0.076250 2.320448 -0.498850 -0.581181 -0.082106 -0.355560 -0.046154 0.271258 0.285464 -0.555374 -0.733318 0.624506 -1.938539 0.240955 -0.603792 1.199001 -0.217917 -0.114165 2.342206 4.000508 -0.095914 0.227301 -0.427051 1.102205 -0.420340 0.022160 0.200382 -0.379963 0.198942 -0.237262 -0.480085 -0.065843 -0.239782 -0.502132 -0.026595 -0.693543 -0.705318 0.963832 1.301265 -1.334408
350 -0.502123 -0.200629 -0.182220 -0.222978 -0.203506 -1.162131 0.104619 -1.739376 -0.047802 -0.032024 0.572598 0.216601 -0.998214 -0.245704 0.940805 -0.389414 0.248347 -0.726613 -0.525034 0.566631 -0.166308 0.488620 0.854021 0.045403 -1.327523 2.073480 0.624338 0.140015 -0.479387 -0.593165 -1.230896 -0.735607 -0.217917 -0.114165 2.233949 0.165318 -1.318767 -0.601780 -0.427051 1.102205 1.542204 -0.281499 0.123993 -0.379963 -0.305269 0.628721 -0.480085 -0.055109 -0.213284 0.452519 -0.396663 0.745757 0.448702 -1.059084 1.301265 -1.334408
351 -0.502123 -0.862642 -0.246596 -0.287767 -0.150701 0.679133 -0.084117 -1.739376 -0.106195 0.114524 0.471331 0.023545 -0.468192 0.116408 -0.130952 0.019061 1.413482 -0.236199 -0.671883 -0.917126 -0.001732 -1.302168 0.401171 2.303951 -1.327523 -0.370659 2.484496 -0.181093 -0.941452 0.937664 -0.736515 -0.542236 -0.210929 -0.114165 -0.087470 0.027304 -0.696186 -0.189690 -0.408210 1.102205 -0.420340 0.130024 -0.094823 -0.292062 -0.517690 0.754547 0.125851 -0.065843 -0.239782 1.210179 1.491475 1.550122 -1.134072 1.400909 -0.818116 0.749396
352 -0.365726 0.119300 -0.073727 -0.453369 -0.203506 1.126203 -0.084117 0.962927 -0.106195 0.643328 0.876402 0.249078 -0.382681 -0.124410 -0.130952 -0.323488 1.721559 0.104964 -0.449201 0.370663 -0.166308 -0.080695 0.006567 1.249962 0.761136 -0.029332 0.391243 -0.181093 -0.430749 0.013067 0.234137 0.373592 -0.217917 -0.114165 -0.276333 -0.279730 0.220447 -0.151338 -0.427051 -0.545353 -0.166518 -0.064821 -0.412895 -0.379963 -0.048083 -0.326081 -0.480085 -0.043863 -0.014556 0.270681 -0.396663 1.573674 0.272507 -0.895430 1.301265 -1.334408
353 -0.425400 -1.160537 0.184827 -0.316534 -0.203506 -0.291173 -0.043674 1.654668 -0.106195 -0.166640 -0.440081 -0.262773 -0.970020 -0.208600 -0.130952 2.404642 -0.011504 -0.135411 -0.410420 -0.273231 0.051851 0.598460 -0.646348 1.077882 -1.327523 0.107702 -0.679209 -0.181093 -0.236195 0.144039 -0.112417 -0.568198 -0.217917 -0.114165 -0.744913 -0.186099 -0.091858 -0.106566 -0.427051 1.102205 -0.420340 0.012300 -0.405059 -0.379963 -0.294788 1.004350 -0.480085 -0.065843 -0.239782 -0.500349 -0.396663 0.801103 -0.093666 -0.409502 1.301265 -1.334408
354 -0.382776 -0.603099 -0.246596 -0.433198 -0.203506 -0.685257 -0.084117 -0.085212 -0.106195 -0.369383 1.281474 -0.457329 -0.776972 -0.070817 0.283219 -0.156412 0.352823 -0.488467 -0.795142 0.286677 -0.166308 -0.079372 -0.407376 -1.890495 -0.789780 -0.817523 0.030436 -0.181093 -0.698260 -0.986462 -0.834875 0.495345 -0.139109 -0.114165 -0.183506 -0.698026 0.755825 -0.127167 -0.232353 -0.720383 -0.420340 -0.304459 -0.077475 -0.206019 -0.539865 0.160572 -0.067228 -0.049060 -0.200036 -0.672382 -0.396663 -0.705032 1.270447 -0.982922 -0.821280 0.749396
355 -0.693932 0.644535 -0.246596 -0.465981 -0.203506 -0.865741 -0.084117 -0.648379 -0.106195 -0.063994 -0.743885 -0.380150 -0.645537 -0.091876 -0.130952 -0.158708 0.047492 -0.785704 -0.383495 -0.553185 -0.166308 -0.436418 -0.375605 -0.519234 -1.327523 -1.089170 -0.744958 -0.181093 -0.211876 -0.799619 -0.154152 -1.741406 -0.174048 -0.114165 -0.728495 -0.466990 -0.467028 -0.278002 -0.169547 1.102205 -0.420340 0.084068 -0.389535 -0.070453 -0.172170 0.115237 -0.089475 -0.052873 -0.239782 0.714135 -0.396663 -0.764858 -0.650933 -0.653095 -0.806205 0.749396
356 -0.710981 0.427085 -0.246596 -0.398125 -0.203506 0.344659 -0.084117 1.299775 -0.038342 -0.351114 0.066259 -0.037206 -0.763396 -0.118883 -0.130952 0.083019 0.387995 -0.248240 -0.715357 -0.259234 -0.166308 0.870942 -0.260723 0.540133 -1.327523 -0.394209 -0.696813 -0.181093 0.760892 -0.090096 0.391557 0.324354 -0.217917 -0.043172 1.185570 0.042504 -0.714438 -0.215073 -0.427051 1.102205 -0.420340 0.146628 -0.354593 -0.379963 -0.162105 2.524446 -0.480085 -0.048983 -0.239782 0.179316 0.062511 -0.501507 0.183478 0.529448 1.301265 -1.334408
357 -0.323102 -0.155585 -0.246596 -0.407671 -0.203506 -0.562727 -0.084117 -0.040098 -0.106195 -0.774135 -0.136277 -0.250902 -0.901831 -0.006058 -0.130952 -0.148278 0.778204 -0.373706 -0.416348 -0.301227 -0.166308 0.502118 -0.868284 -0.336399 0.811847 0.457751 0.243005 -0.181093 0.128593 -0.238372 -0.488501 -0.817074 -0.217917 -0.114165 -0.149930 -0.698026 -0.416329 -0.320920 -0.427051 -1.164401 -0.239325 -0.289880 -0.098223 -0.379963 -0.472404 -0.975568 -0.480085 -0.065843 -0.239782 -1.028036 0.129779 0.035250 -0.129053 -0.065478 1.301265 -1.334408
358 0.384457 -0.991587 0.090214 -0.329621 0.085354 -0.367341 0.200212 0.481716 0.016520 -0.036718 -1.351492 -0.042733 -1.019591 -0.138448 0.080460 -0.075268 0.026189 -0.563984 -0.303833 -0.637172 0.166672 -1.111073 2.184255 -1.890495 1.254305 -0.375214 0.270517 0.354975 -1.622390 0.787165 -1.392612 -0.935245 -0.208988 -0.114165 -1.230774 0.370210 0.745685 0.153884 -0.219792 1.102205 3.411854 0.010469 -0.079791 0.177154 -0.378782 -1.334544 -0.091577 -0.003895 -0.239782 -0.181241 -0.396663 0.095786 -1.866416 -0.765555 -0.734494 0.749396
359 0.478230 1.818276 0.013383 0.187312 -0.203506 -0.385555 -0.084117 0.125318 1.422570 -0.408805 -0.035009 -0.427763 -0.661113 0.265379 0.064064 0.099090 -0.124606 0.047583 -0.529356 -0.581181 -0.093588 -0.745690 -0.332553 0.454093 -1.327523 -1.640119 -0.916630 0.141513 -0.309153 0.056188 0.112868 -2.596806 -0.118727 -0.114165 0.487762 0.286307 -0.951708 -0.287578 -0.232353 0.114348 0.222682 -0.052003 1.775661 -0.223351 0.311264 -0.404722 -0.056080 0.542591 -0.200036 -0.109040 0.339648 -0.774141 -0.669931 -0.110727 -0.823575 0.749396
360 -0.331627 -0.750290 0.212363 -0.413113 -0.203506 -0.234876 -0.084117 0.424572 -0.106195 -0.187129 -0.541349 -0.728401 -0.675970 0.005799 -0.130952 -0.437169 0.190855 -0.617500 0.160428 -0.329222 -0.085934 0.551877 -0.510746 -0.363287 0.760676 -1.055250 -0.513763 1.416462 -0.406430 -0.607143 -0.760138 0.166793 -0.217917 -0.114165 0.049055 -0.698026 2.400492 -0.297412 -0.427051 1.102205 -0.420340 0.086040 2.694246 -0.379963 -0.066642 -0.316829 -0.480085 -0.065843 -0.213284 0.069233 -0.396663 -0.551984 0.157030 -0.975299 1.301265 -1.334408
361 0.034940 0.471773 -0.246596 -0.198401 -0.203506 -0.105722 -0.084117 0.493746 -0.041556 -0.370638 1.382742 0.406546 -0.636397 -0.092445 -0.130952 0.098041 -0.041102 -0.041610 -0.830095 0.342668 -0.105070 -0.166185 0.315297 0.066913 0.862981 -0.536380 -0.381861 -0.181093 0.274508 0.503619 0.470804 1.495325 -0.217917 -0.114165 0.493194 -0.020727 0.041987 -0.046204 -0.427051 1.102205 -0.420340 -0.038656 0.284212 -0.379963 -0.420955 0.197580 -0.480085 -0.031871 -0.239782 0.232352 -0.184849 0.878685 0.671087 1.431332 1.301265 -1.334408
362 -0.365726 -0.361731 -0.246596 -0.413070 -0.203506 -0.721685 -0.084117 -1.739376 -0.106195 -0.056613 -0.237545 -0.371766 -0.368485 -0.237944 -0.130952 -0.437169 -0.201034 -0.588661 -0.381025 -0.329222 -0.166308 -0.300111 -0.776655 -0.180452 -1.327523 0.511441 1.704434 -0.181093 -0.868494 -0.601806 -0.500706 -0.930769 -0.217917 -0.114165 -0.022540 -0.698026 -0.663739 -0.292602 -0.427051 1.102205 -0.420340 -0.304459 -0.212215 -0.379963 -0.323926 0.138367 -0.480085 -0.055632 -0.133793 -0.604638 -0.396663 0.047059 0.371221 -0.980824 1.301265 -1.334408
363 -0.433925 -0.266775 -0.246596 -0.415359 -0.203506 -0.739899 -0.084117 0.478708 -0.106195 0.727170 -1.452760 -0.025700 0.100360 -0.154177 -0.130952 0.199028 0.462718 -0.640467 0.241695 -0.469199 -0.166308 0.081153 -0.539064 -0.519234 0.896457 -0.717994 -0.943597 -0.181093 0.213710 -0.799619 -0.170975 -1.386892 -0.096405 -0.114165 -0.595427 -0.065718 -0.296680 -0.064545 0.320339 1.102205 -0.228396 -0.040452 -0.365928 0.500902 -0.316636 1.320767 0.135125 -0.042227 -0.120545 -0.897006 -0.396663 -0.714551 -0.134455 -0.682399 -0.805071 0.749396
364 -0.246379 -0.426775 -0.161587 -0.345084 -0.203506 1.126203 -0.084117 1.367445 -0.050699 -0.013609 0.876402 -0.070908 0.138187 0.069801 -0.130952 -0.196098 0.870513 -0.477170 -0.463774 -0.161250 -0.105070 0.800406 -0.154935 -0.083657 0.844875 2.056617 0.733484 0.098587 1.247276 -0.074466 0.642256 -0.784845 0.034619 -0.114165 -0.255102 -0.240818 0.956592 -0.004753 0.621808 1.102205 -0.420340 -0.278171 -0.315857 0.577660 0.062387 3.628204 0.207531 -0.050652 -0.023388 -0.236505 -0.396663 -0.429295 1.831440 0.676773 -0.813973 0.749396
365 0.069039 0.156766 -0.246596 -0.198012 -0.203506 0.897700 -0.084117 -0.805525 -0.106195 0.242586 -0.946420 -0.435728 -0.120650 -0.017564 -0.130952 -0.437169 0.902786 -0.039231 0.775613 -0.273231 -0.166308 0.608782 0.165191 3.562285 -1.327523 -1.455306 0.992950 0.128036 0.663615 -0.183009 -0.378578 0.002069 -0.068259 -0.114165 -0.205232 -0.698026 0.074434 -0.028468 -0.018813 -0.114908 -0.398945 -0.255721 -0.126216 -0.069834 -0.367858 -0.104034 -0.020120 -0.059784 0.179756 -0.866699 -0.396663 -0.650607 1.243254 -0.505526 -0.820255 0.749396
366 -0.289003 0.007215 -0.246596 -0.398427 -0.203506 -0.628959 -0.074312 -0.008519 -0.064460 -0.373872 0.977670 -0.407906 -0.315252 -0.168303 -0.130952 0.169542 0.022473 -0.118762 -0.456611 -0.343220 -0.166308 0.171540 -0.237470 0.115310 1.040668 0.456201 -0.141417 -0.181093 0.067795 -0.671104 0.234924 -0.402131 -0.217917 -0.114165 -0.405699 0.367170 -0.714438 -0.151076 -0.427051 1.102205 0.342512 -0.304459 -0.291117 -0.379963 -0.240995 -0.662852 -0.480085 -0.051412 -0.054302 -0.539569 0.265877 1.012807 -0.232982 -0.860391 1.301265 -1.334408
367 -0.297528 -0.430109 -0.246596 -0.220732 -0.203506 -0.695192 -0.084117 -0.013030 -0.106195 -0.319719 0.268795 -0.500904 -0.951909 -0.062326 -0.130952 -0.317453 -0.149625 -0.175697 -0.712517 -0.581181 -0.166308 -0.439594 -0.400239 -0.879526 -1.327523 -0.046873 -0.166578 -0.181093 -0.382110 -0.668139 -0.460546 -1.021188 0.966342 -0.114165 -0.242511 -0.698026 -0.008712 -0.278488 2.367814 -1.344370 -0.420340 -0.304459 -0.184321 0.923073 -0.654943 -0.098483 0.277541 -0.062460 -0.239782 -0.366644 -0.396663 -0.614291 0.047141 -0.106671 -0.822000 0.749396
368 -0.536223 -0.403471 -0.246596 -0.436956 -0.203506 -0.582597 -0.084117 0.444121 -0.106195 -0.070191 -3.174315 -0.411613 -0.899240 -0.112486 -0.130952 -0.395186 -0.198312 -0.176440 0.286157 -0.973116 -0.166308 0.384603 -0.273155 0.013138 -1.327523 0.202483 -0.839912 -0.181093 0.955445 -0.141710 -0.485744 0.691402 -0.217917 -0.114165 -0.522720 -0.698026 -0.329128 -0.472092 -0.427051 1.102205 -0.420340 -0.304459 -0.367604 -0.379963 -0.141856 0.282698 -0.480085 -0.065843 0.475641 0.079929 -0.396663 0.530876 -0.295191 0.841126 1.301265 -1.334408
369 -0.536223 0.644072 -0.246596 -0.448834 -0.203506 -0.489871 -0.084117 0.185470 -0.106195 -0.014732 -0.035009 8.489001 -0.503089 -0.321829 -0.130952 -0.289312 -0.461926 -0.359583 -0.235534 -0.469199 -0.166308 0.344107 -1.506004 0.066913 0.285016 -0.024971 -0.431350 -0.181093 0.274508 -0.130697 1.426136 -0.313055 -0.150755 -0.114165 -1.094249 -0.542381 -0.720522 -0.194090 -0.056496 -0.669935 -0.109412 -0.162384 -0.345722 0.005067 0.123841 -0.392695 -0.067048 -0.065843 0.078184 -1.022688 -0.171428 -0.629519 0.217749 -0.197310 -0.807817 0.749396
370 -0.007684 0.707819 -0.246596 -0.068001 -0.203506 0.109533 -0.084117 -1.739376 -0.056584 -0.462174 -0.338813 0.203164 2.648425 -0.170482 -0.130952 -0.044699 -1.084909 1.059926 -0.529727 -0.217240 -0.166308 0.494972 0.028438 0.454093 0.635217 0.807994 -0.891926 -0.181093 -0.333472 -0.544580 -0.824424 2.059325 0.574242 -0.114165 -0.393108 -0.531437 0.173804 -0.002704 0.408268 -0.006259 -0.420340 -0.088309 -0.360803 0.872313 3.532399 1.124625 1.531241 -0.056155 0.064935 0.194024 -0.396663 -0.646281 1.107290 -0.877526 -0.827804 0.749396
371 -0.578847 -0.425062 -0.246596 -0.477946 -0.203506 -0.271304 -0.084117 -0.613040 -0.106195 -0.381235 0.471331 -0.504147 -0.621188 -0.161963 -0.130952 -0.193934 -0.099612 -0.343380 -0.206881 -1.225075 -0.166308 0.000295 -0.604217 0.368053 -1.327523 0.386036 -0.986013 -0.181093 -0.163238 0.125698 0.321974 0.079059 -0.096405 -0.114165 -0.777501 -0.034102 -0.197311 -0.315980 0.131922 1.102205 -0.420340 -0.304459 -0.210046 0.091730 -0.162220 -0.640647 -0.032337 -0.052254 -0.098464 0.303661 -0.250109 -0.188005 -0.197222 -0.270535 -0.811937 0.749396
372 0.265110 -0.895543 -0.184396 -0.057462 -0.203506 0.414203 -0.084117 -0.610033 0.242352 0.445649 -0.338813 -0.145417 -0.187955 0.241830 -0.130952 -0.418933 -0.062227 0.950070 -0.048051 -0.721158 0.078643 -0.258821 -0.343719 0.593908 -1.327523 0.926712 -0.379927 -0.181093 -1.111686 -0.530792 1.791088 -1.220826 0.272402 -0.114165 -0.534570 -0.555148 -0.010740 0.327871 1.413163 1.102205 -0.420340 -0.285760 -0.221283 0.474284 -0.405373 0.715689 0.004986 -0.032234 0.179756 0.628118 -0.047843 -0.501741 -1.327029 -0.905990 -0.815779 0.749396
373 -0.578847 -0.380254 -0.246596 -0.389314 -0.203506 -0.205071 -0.073700 -1.661179 -0.094155 -0.316781 0.977670 -0.506707 -0.528974 -0.210793 -0.130952 -0.120858 0.584448 -0.508536 -0.357188 -0.665167 -0.070624 -0.953063 0.820984 0.647682 0.733151 -1.441448 -0.005834 -0.181093 -0.528026 0.354369 0.529864 -0.462559 -0.217917 -0.114165 1.484666 -0.283378 -0.475140 -0.519415 -0.427051 1.102205 0.263087 0.352935 -0.405404 -0.379963 0.930608 -0.087381 -0.480085 -0.049859 -0.239782 -0.529764 6.876010 -0.104031 -0.086588 0.408981 1.301265 -1.334408
374 -0.399826 -0.328271 -0.246596 -0.290402 -0.203506 0.407580 -0.084117 1.089245 -0.106195 -0.355478 -0.541349 -0.155169 -0.187239 -0.195930 -0.130952 -0.348284 -0.254455 -0.188481 -0.213797 0.818590 -0.074451 -1.267761 -1.506004 0.120688 -1.327523 0.959371 0.167523 -0.181093 -0.382110 -0.215752 -0.143844 -1.109816 -0.217917 -0.114165 1.852272 0.092359 -0.414302 -0.017290 -0.427051 1.102205 -0.420340 -0.304459 -0.131686 -0.379963 -0.311407 -1.016277 -0.480085 -0.049776 -0.182371 -0.511937 -0.099346 0.948443 -0.820050 -0.064918 1.301265 -1.334408
375 -0.152606 0.037808 -0.246596 0.319353 -0.203506 0.361217 -0.058380 0.340360 -0.106195 -0.012177 0.471331 -0.332692 -0.719430 -0.200071 -0.130952 -0.288459 0.447830 0.650678 -0.110299 -0.077263 -0.166308 -0.120132 0.168414 1.389777 0.671914 -1.043039 -0.481846 -0.181093 0.469061 0.597273 0.506383 0.098754 -0.217917 -0.114165 -0.613079 0.567806 0.936313 0.144620 -0.427051 1.102205 -0.420340 0.224259 -0.296636 -0.379963 -0.409190 -0.651750 -0.480085 -0.051819 -0.239782 -0.466477 1.489418 1.540556 1.517045 0.972540 1.301265 -1.334408
376 -0.101457 0.281044 -0.246596 1.103651 1.430326 -0.446820 -0.079215 -0.058143 -0.106195 -0.013036 -0.136277 -0.435508 0.455445 -0.193374 -0.130952 -0.434676 -0.429097 -0.244970 -0.024709 -0.721158 -0.166308 -0.916273 -0.286278 -0.008372 0.805959 -1.830455 -0.228036 0.951770 1.174318 0.852101 -0.280289 -0.378407 0.476217 -0.114165 -0.675539 -0.411055 -1.316739 -0.625232 -0.282598 1.102205 -0.420340 -0.035804 -0.334042 -0.341584 0.044002 -1.069938 0.075891 -0.065843 -0.129377 -1.124304 -0.396663 -0.201414 -1.096821 1.736191 -0.831567 0.749396
377 0.051989 -0.455066 0.759406 -0.354111 -0.203506 0.897700 -0.039384 0.938866 -0.067403 0.178524 -0.035009 -0.551187 1.975237 -0.134063 -0.130952 -0.328539 -0.959971 -0.373706 1.041647 -0.385213 0.028887 -0.992235 0.348104 -0.922546 -1.327523 0.237662 -0.266643 -0.181093 0.736573 -0.674599 -2.887210 0.400450 -0.217723 -0.114165 -0.058091 -0.182452 -0.304792 -0.088380 -0.383087 1.102205 -0.226241 -0.190063 0.070669 -0.379963 -0.676468 0.273446 -0.468161 -0.047270 -0.018972 -0.770432 -0.396663 0.732678 -0.225160 -0.939630 -0.601852 0.749396
378 -0.280478 -0.838315 -0.246596 -0.278351 -0.203506 -0.814410 -0.084117 -0.333336 -0.106195 0.441735 1.686546 -0.291896 -0.221038 -0.115719 -0.130952 -0.131026 0.220276 -0.291796 -0.633719 -0.161250 0.143708 -0.038347 0.434783 -2.202390 -1.327523 0.191435 2.434242 -0.181093 -0.892814 -0.986462 -1.868166 0.606354 -0.207047 -0.114165 0.062140 -0.698026 -1.266040 -0.047855 -0.420771 -0.760388 0.241537 -0.304459 0.001279 -0.267921 -0.267680 0.130966 1.338231 -0.055835 0.714115 -0.731212 -0.396663 -0.428370 1.118465 -1.079436 -0.827727 0.749396
379 0.231011 0.643948 0.001828 0.000589 -0.203506 1.367951 -0.084117 0.328329 -0.033272 -0.013931 -0.845152 -0.129509 1.779936 -0.205730 -0.130952 -0.324669 -0.096890 0.355151 -0.295312 0.006723 -0.166308 -1.823842 -0.675241 -0.814996 0.672803 -0.639010 -0.856972 -0.181093 -1.233282 -0.087131 0.440666 1.418335 -0.203165 -0.114165 0.448755 1.869509 -0.907093 -0.024357 -0.370526 -0.500745 -0.076549 -0.065842 0.004532 -0.228922 -0.101903 -1.210568 0.042541 -0.048765 -0.239782 1.899203 0.476830 -0.781193 -0.341754 1.410910 -0.814511 0.749396
380 0.282159 0.626242 0.401513 -0.214901 0.403900 -0.589220 -0.084117 1.489252 -0.106195 -0.233397 -0.946420 -0.180101 -0.890339 0.178297 -0.130952 -0.388036 0.490790 -0.269201 -0.527010 -0.777149 0.002096 -0.924213 0.761932 1.303737 0.694645 -0.606157 0.772562 -0.181093 0.566338 0.771429 -0.300548 0.035193 -0.176960 -0.114165 -0.064757 -0.303441 1.212114 0.302925 -0.169547 -0.381911 -0.420340 -0.098839 -0.079052 -0.043835 -0.708253 0.786004 -0.072190 -0.064464 0.563965 -0.119736 -0.182809 -0.641629 1.204141 1.657441 -0.804369 0.749396
381 0.000841 0.862452 0.695706 -0.371561 -0.203506 0.642705 -0.084117 -0.234086 -0.011409 -0.177126 1.787814 -0.526431 -0.339813 -0.154719 -0.130952 -0.400500 -1.987364 -0.259687 -0.655457 -0.049268 -0.024696 -0.855133 -0.194879 0.217483 1.101986 -0.835355 -1.056928 -0.181093 -0.528026 -0.365319 -0.186438 0.348526 0.138467 -0.114165 -0.091173 0.068648 -0.892898 0.090444 -0.043935 -1.541178 -0.420340 -0.210523 -0.260019 0.039732 -0.321410 -1.602850 0.536269 -0.043122 -0.239782 -0.830153 -0.396663 -0.758316 0.294112 -0.403137 -0.827675 0.749396
382 -0.203755 -0.230594 -0.246596 -0.078238 -0.203506 0.026743 -0.084117 -0.354389 -0.106195 -0.431025 -0.237545 -0.736189 0.513100 -0.101923 -0.130952 -0.345726 -0.153211 -0.647529 0.649884 0.622622 -0.166308 -0.626454 -0.473220 -0.696691 -1.327523 -0.290997 -0.452332 -0.181093 -0.309153 -0.360363 -1.106191 -0.535969 0.014238 -0.114165 -0.393849 -0.698026 -0.945624 0.026129 0.295217 -0.256748 -0.241018 -0.296817 -0.343504 -0.065501 -0.522188 -0.002263 -0.155037 -0.046573 -0.213284 -0.003859 -0.171077 -0.550856 0.532888 0.476610 -0.818596 0.749396
383 -0.212280 -0.531846 -0.075753 -0.484252 -0.203506 1.808398 -0.084117 0.200508 -0.086550 -0.013169 -0.440081 -0.588364 -0.432072 -0.261821 -0.130952 0.434949 0.346894 -0.622257 -0.365957 -0.469199 -0.166308 -0.626057 -0.230103 -0.503101 1.051263 -1.024917 -0.797409 -0.181093 0.347465 -0.581495 -0.003032 -0.680997 -0.134256 -0.114165 0.077076 -0.698026 1.854974 3.367295 -0.182108 -0.488502 -0.420340 -0.179657 -0.259723 -0.150926 -0.379338 -0.556454 -0.047640 -0.065843 0.281329 0.055862 0.511739 -0.456785 0.113820 -0.994567 -0.819276 0.749396
384 0.043465 -0.493424 -0.017379 0.012640 -0.203506 -0.453443 -0.084117 0.009527 -0.106195 -0.205127 0.471331 -0.173636 0.454910 0.012020 -0.130952 0.425831 -1.114152 0.232659 0.113742 0.146700 -0.166308 3.346840 -0.148834 -0.664426 0.317628 1.236736 -0.583308 -0.181093 -0.698260 -0.374850 0.313670 0.477440 -0.193459 -0.114165 -0.467172 -0.531437 -0.181087 -0.069368 -0.219792 -0.351343 -0.012901 -0.167103 -0.111628 -0.283396 -0.404232 -0.244664 -0.355132 -0.041177 -0.239782 -1.153719 0.073733 -0.413292 1.097232 -1.279179 -0.794826 0.749396
385 0.333308 1.087519 -0.114318 3.195400 2.243245 -0.479936 -0.084117 0.141860 -0.106195 -0.139221 0.977670 -0.419313 1.151081 -0.321829 -0.130952 -0.420966 1.191064 -0.550457 -0.740306 -0.385213 -0.166308 -1.483470 -0.380555 -0.062147 -1.327523 -1.516555 -1.005330 1.542576 -0.868494 1.264755 -1.089869 -0.142064 0.021031 -0.114165 -0.711090 -0.214067 -0.982127 -0.437113 -0.420771 1.102205 -0.399560 1.570881 -0.131933 -0.379963 1.476864 -0.720214 1.305218 -0.060534 -0.200036 -0.443302 -0.136279 -0.744419 -0.230002 2.711893 -0.832300 0.749396
386 -0.587372 3.468038 -0.246596 -0.348842 -0.203506 -0.513052 -0.084117 0.520814 -0.106195 -0.161930 -0.338813 -0.291852 -0.755059 -0.006024 -0.130952 -0.243066 -0.037031 -0.224902 0.154252 -0.581181 -0.166308 2.180155 0.355932 -0.148187 -0.835535 -0.439370 -0.580473 -0.181093 0.712253 -0.111021 0.630301 0.280488 -0.167254 -0.114165 -1.095730 -0.116181 0.098770 -0.035591 -0.056496 -0.682749 -0.159283 -0.304459 -0.283823 -0.115642 -0.179333 0.023643 -0.231376 -0.065843 -0.080799 0.829566 -0.396663 -0.703552 0.414059 -0.249694 -0.799890 0.749396
387 0.043465 0.627629 -0.089333 -0.306686 -0.203506 2.123002 -0.084117 0.913302 -0.106195 0.682269 -0.440081 -0.378650 -0.664372 0.650413 -0.130952 -0.420933 3.280003 -0.003108 0.306412 0.146700 -0.061056 -0.429007 0.494757 0.454093 -1.327523 -0.305824 -0.592993 0.116556 0.785211 0.242734 1.678553 -0.029265 -0.166284 -0.114165 1.009915 -0.209203 0.827817 -0.442451 -0.307720 0.641077 -0.420340 -0.111622 -0.182891 -0.288348 -0.401562 -0.063325 -0.163162 -0.065843 0.095849 0.113355 1.020520 -0.781193 0.824559 0.522140 -0.821853 0.749396
388 -0.399826 -0.740828 -0.246596 -0.377090 -0.203506 0.202259 -0.084117 2.910329 -0.106195 0.447530 -1.250224 -0.318240 2.382034 -0.043736 -0.130952 0.477522 -0.987250 -0.455763 0.234778 -1.001112 0.013578 0.842490 -0.165871 -0.427817 0.623130 0.505529 -0.564031 -0.181093 1.782298 -0.353014 -0.082422 0.472069 -0.089611 1.165730 -0.556790 -0.178804 0.987011 -0.034748 0.169605 -0.832440 -0.420340 -0.283048 -0.381157 0.377717 -0.431040 0.554705 0.205115 -0.055835 -0.160290 0.314358 -0.005824 -0.271245 2.229647 -0.127862 -0.811720 0.749396
389 0.282159 0.353387 -0.172391 -0.133698 0.703036 0.914259 -0.084117 -1.417566 0.021998 -0.614120 -0.440081 -0.661129 1.167107 -0.233165 -0.130952 -0.437169 -0.652096 0.537997 -0.425858 -0.581181 -0.040005 -1.746557 -0.619872 0.669192 -1.327523 -1.750308 -0.939816 0.486080 -0.138918 0.901702 -0.738591 -0.964788 -0.217917 -0.114165 0.882031 0.064392 -0.937512 -0.479893 -0.427051 1.102205 -0.225471 -0.242727 -0.197528 -0.379963 0.003849 -0.179900 -0.480085 -0.050367 -0.239782 -1.154610 -0.396663 -0.396485 1.019006 -0.075898 1.301265 -1.334408
390 -0.203755 -0.051880 -0.092559 0.029053 -0.203506 0.039989 -0.044899 0.844128 -0.082204 0.046857 -1.250224 -0.193581 -0.591191 -0.203428 0.458947 -0.229487 0.608118 0.177805 -0.589998 0.426654 0.166672 -0.568755 1.246438 0.056158 -0.341583 -0.976751 -0.559893 -0.021537 -0.941452 0.069192 -0.553251 0.179326 -0.216752 -0.114165 -0.511364 -0.571564 -0.434581 -0.227067 -0.018813 -0.443739 1.307699 -0.299899 0.379279 0.072540 0.201615 1.082066 -0.420072 -0.065843 0.162092 0.505110 -0.396663 -0.718272 0.383141 0.246725 0.119026 0.749396
391 -0.331627 1.633562 -0.246596 -0.249800 -0.203506 0.212194 -0.084117 4.654721 -0.046580 -0.131684 0.977670 0.138717 0.691868 -0.180246 -0.130952 0.036576 0.472186 0.131425 -0.741911 0.006723 -0.166308 -0.118279 -0.626549 0.583153 0.731354 0.389040 0.239741 -0.181093 0.323146 0.002626 0.345598 0.115764 0.569389 -0.114165 -0.632582 -0.034710 -0.254094 0.091740 1.501091 -0.816214 -0.420340 -0.304459 0.085060 1.161395 -0.206124 -0.427852 0.580973 -0.045600 -0.208868 3.316471 -0.396663 -0.125861 -0.615545 2.868344 -0.821596 0.749396
392 -0.425400 -1.005752 -0.246596 -0.394281 0.022844 0.033366 0.015153 -1.739376 -0.081933 -0.165505 0.876402 0.803071 -0.975287 -0.269736 0.007083 -0.020691 -1.552208 0.182116 -0.710046 -0.329222 -0.166308 -0.607662 0.138140 0.142198 -1.327523 0.322849 -0.439571 -0.181093 -0.284834 -0.078702 -0.683540 0.106812 0.575213 -0.114165 -0.189431 -0.005527 -0.473112 -0.219171 0.646930 -0.798447 1.092127 -0.191612 -0.319602 2.040403 -0.329160 -0.784978 2.523266 -0.055182 -0.239782 0.180653 -0.396663 -0.529610 -0.812600 -0.137514 -0.826470 0.749396
393 -0.348677 -0.857576 -0.157460 1.021628 -0.203506 -0.635583 -0.084117 -0.516798 -0.106195 0.234167 -1.554028 -0.386372 0.546674 0.119525 -0.130952 -0.283736 0.840075 0.095748 -0.229112 0.538635 -0.085934 0.156453 -0.643240 -0.578386 0.731746 -0.256593 -0.650246 -0.181093 0.760892 -0.598248 0.748133 1.840887 -0.217723 -0.114165 -0.808361 -0.530221 -0.402134 -0.293615 -0.370526 -1.345214 -0.420340 -0.147488 -0.465726 -0.376249 -0.416830 0.156871 -0.471142 -0.065843 -0.098464 -0.726755 -0.396663 -0.544164 -0.446428 -0.785907 -0.544182 0.749396
394 -0.297528 -0.741674 0.009482 -0.365298 -0.203506 -0.065983 -0.084117 -0.103257 -0.106195 0.404031 -0.541349 -0.220499 0.302477 -0.220881 -0.130952 -0.198066 0.022473 -0.685287 -0.308650 0.202691 -0.166308 0.013793 -0.011621 0.045403 0.923292 -0.130121 -0.004652 -0.181093 0.639296 -0.094396 -0.373496 -0.065074 -0.217917 -0.114165 3.571055 -0.414703 0.433381 -0.428932 -0.427051 1.102205 0.426709 -0.294211 -0.392196 -0.379963 -0.350415 1.170885 -0.480085 -0.065843 0.104681 0.569288 -0.396663 -0.171811 2.114916 0.727058 1.301265 -1.334408
395 -0.280478 0.001800 -0.126098 -0.123720 1.060401 0.851338 -0.084117 -0.688230 -0.106195 0.265906 -0.338813 -0.327330 0.679007 -0.041704 -0.130952 -0.350646 -0.162501 0.707316 -0.751421 -0.385213 -0.105070 -0.150305 0.139176 3.390205 0.684796 0.856062 1.504183 0.293082 1.271595 0.288650 -0.564920 0.677973 -0.131344 -0.114165 -0.357310 0.599421 0.139329 0.542685 0.056555 0.079268 -0.134502 -0.209554 -0.262533 0.135680 -0.184012 -0.505569 -0.004260 -0.065843 -0.239782 -0.456672 -0.122236 -0.377708 -1.537867 0.677472 -0.807543 0.749396
396 0.350358 4.102326 -0.246596 1.794522 1.314724 0.278426 -0.084117 0.119303 -0.106195 0.119014 1.078938 -0.241228 1.309062 -0.321829 -0.130952 -0.416637 -0.495548 0.012798 0.559724 0.258681 -0.166308 -0.534612 -0.316092 -0.341777 0.803603 -1.893545 -0.840402 0.834641 0.298827 3.284293 0.320543 0.019974 0.612093 -0.114165 0.012517 -0.173940 -0.164864 -0.293699 -0.119302 1.102205 -0.420340 -0.152189 -0.330839 -0.165163 0.299155 1.028405 0.568627 -0.049210 -0.239782 0.127617 -0.270943 -0.356434 0.884532 1.338175 -0.830849 0.749396
397 -0.306053 0.405537 -0.246596 -0.439806 -0.203506 0.061515 -0.027741 -0.089723 -0.106195 -0.067346 1.180206 -0.588143 0.247582 -0.194951 -0.130952 2.102763 -0.399854 0.043644 -0.274068 -0.413208 -0.166308 0.056406 -0.249442 0.981087 -1.327523 -0.299525 -0.540240 -0.181093 -0.211876 -0.200969 1.334003 0.599192 -0.217723 -0.114165 -1.114617 0.744123 -0.270317 0.311259 0.031432 0.062119 -0.420340 -0.304459 -0.408114 0.155489 0.573000 1.125550 -0.440115 -0.052457 0.091433 -0.702688 1.192197 -0.453992 -0.780937 0.309284 1.301265 0.749396
398 -0.263429 -0.097548 -0.246596 -0.153178 -0.203506 -0.579285 -0.084117 -0.582965 -0.075324 0.313547 -0.237545 -0.477230 0.581594 -0.126838 -0.130952 0.302245 0.637207 0.059030 0.276277 0.230686 -0.166308 1.064816 -0.440988 0.174463 1.104448 1.278990 -0.550208 -0.181093 -0.357791 -0.186356 0.078363 0.394183 -0.132315 -0.114165 -1.048823 0.307587 -0.858422 0.676898 -0.295159 -0.361294 -0.420340 0.062587 -0.264110 -0.266063 -0.576365 0.439981 -0.036606 -0.054403 -0.239782 -0.367536 0.141830 -0.532697 -0.170402 0.083770 -0.825405 0.749396
399 0.384457 1.014514 -0.172316 -0.458552 -0.203506 0.351282 -0.084117 -0.273185 -0.106195 -0.390280 -0.237545 -0.465801 -0.189473 -0.034369 -0.130952 -0.329327 0.275046 -0.058110 -0.166371 0.286677 -0.166308 1.818874 -0.721056 -1.492561 -1.327523 -0.228003 -0.317327 -0.181093 -1.476474 -0.752919 -0.222447 0.600088 -0.051371 -0.114165 0.350003 -0.595276 1.544697 1.417398 -0.420771 -0.708297 -0.420340 -0.301800 -0.290969 -0.127403 -0.415642 -0.633246 0.021401 -0.065843 0.193005 1.555136 0.114923 -0.681475 -0.010224 -1.129861 -0.445624 0.749396
400 0.282159 1.338986 -0.246596 0.045337 -0.203506 0.728808 -0.084117 1.376468 -0.106195 0.109568 -0.237545 0.342606 -0.848175 0.269545 0.156336 -0.429559 -0.249035 -0.014852 -0.322483 0.006723 -0.166308 0.052701 0.182688 2.465276 -0.023440 1.184597 -0.467445 -0.181093 0.736573 -0.325841 1.359273 0.379859 0.038501 -0.114165 0.010048 0.165318 -0.394022 0.312712 0.358023 1.102205 -0.420340 -0.169163 0.774725 -0.083453 -0.514410 0.952539 -0.183364 -0.048402 0.117930 1.195917 0.355270 -0.352912 -0.575314 -0.489790 -0.818823 0.749396
401 -0.007684 0.944645 -0.096161 -0.344911 -0.203506 -0.171955 -0.084117 0.645628 0.020640 0.738710 0.572598 -0.026097 0.437222 -0.072968 0.198130 -0.324079 0.557726 -0.445357 -0.865418 -0.133254 -0.166308 0.105900 -0.213757 -0.062147 -1.327523 0.647312 -0.818808 -0.181093 -0.163238 0.243962 0.760160 0.398659 -0.070200 -0.114165 -0.773057 0.836537 -0.375771 0.336346 -0.131863 -0.719273 -0.420340 -0.304459 -0.348285 -0.114404 -0.285856 -0.102184 0.069305 -0.052951 0.237167 -0.105475 -0.396663 0.114559 0.910980 2.632374 -0.823487 0.749396
402 -0.246379 0.948859 -0.246596 -0.125189 0.384491 0.417515 -0.069410 0.677208 -0.081842 -0.178655 -0.035009 0.084551 0.079968 0.071215 -0.130952 -0.069561 -0.690676 0.263877 0.191675 -0.441204 -0.059142 -1.101810 0.038913 1.024107 0.481275 0.149763 0.267347 -0.181093 1.198637 0.486548 1.126543 1.903553 0.164672 -0.114165 -0.378542 0.428577 -0.339267 -0.114909 0.182166 -0.750788 0.197207 -0.170554 -0.165051 -0.029598 0.428215 0.347461 0.076142 -0.048905 -0.239782 -0.118845 0.204906 -0.444879 -1.172440 -0.558889 -0.825379 0.749396
403 0.026415 -0.397828 0.211537 0.126885 -0.203506 -0.506429 -0.084117 0.329833 -0.106195 0.059715 0.673866 0.575265 0.020483 -0.218278 -0.130952 -0.053293 -1.396169 -0.383665 -0.441790 0.762599 1.142648 -0.110868 0.702073 -0.933301 0.691039 0.558153 -0.179729 -0.181093 -0.065961 -0.604601 -0.646673 -0.791112 -0.164343 0.788455 -0.298059 0.267460 0.171776 -0.261017 -0.219792 -1.279280 -0.258565 -0.061757 -0.040069 -0.075405 0.040100 0.943287 0.050305 -0.057848 -0.186787 0.212742 -0.396663 0.607202 -0.533594 -0.158215 -0.815026 0.749396
404 -0.271953 -0.403815 -0.150482 0.027110 -0.203506 0.162519 -0.084117 -0.217545 -0.106195 0.549020 1.180206 0.232685 -0.753764 -0.161979 -0.130952 -0.203117 -1.149809 0.015474 0.392126 0.118704 -0.166308 0.401542 -0.660392 -0.094412 -1.327523 -0.403221 2.110821 -0.181093 -0.017322 -0.751839 -0.096095 0.058469 -0.217917 -0.114165 -0.680477 -0.698026 -0.625208 -0.237338 -0.427051 1.102205 -0.420340 -0.304459 0.681186 -0.379963 -0.504796 0.164273 -0.480085 -0.046786 0.003109 -0.791825 -0.396663 0.716168 0.539593 -0.516297 1.301265 -1.334408
405 0.657251 -0.418577 15.475526 -0.420801 17.801752 -0.589220 1.063002 1.444138 0.575509 -0.665095 -0.440081 3.389376 2.648425 -0.094560 1.319259 0.248455 -0.410695 -0.170494 -0.324706 13.612489 18.256679 -1.794463 14.675744 2.519051 1.521700 6.941794 2.017298 1.464379 2.536193 2.065623 -0.817766 0.160526 -0.217723 18.059934 -0.790339 14.791085 3.209644 0.681100 -0.131863 -1.694258 15.449485 0.048888 -0.001727 0.780698 -0.217207 1.405885 -0.312890 -0.043640 0.237167 -0.814109 0.395631 -0.403289 -1.833636 -1.171055 0.551552 0.749396
406 -0.408350 2.704795 -0.246596 0.162390 0.313988 0.602966 -0.051640 -0.146867 -0.106195 -0.174822 0.066259 -0.029054 2.648425 0.595640 -0.130952 -0.048438 -0.839922 -0.140019 1.026702 -0.133254 0.189636 -0.939829 -0.239427 -0.040637 -0.327568 -0.317066 -0.113167 0.550302 -0.455068 0.791571 0.614838 1.649306 -0.079905 -0.114165 -0.846381 -0.041398 -0.434581 -0.402519 -0.282598 1.102205 -0.420340 -0.271991 -0.412500 -0.206019 -0.160463 -0.535175 0.259592 -0.048542 -0.239782 0.129845 -0.396663 -0.191447 -1.245078 2.899117 -0.827694 0.749396
407 -0.369989 0.374791 -0.210356 -0.409010 -0.203506 -0.569350 -0.084117 -0.091979 -0.106195 0.337477 0.066259 -0.064179 -0.594233 -0.117921 -0.130952 -0.231062 -0.208466 -0.528753 -0.277156 0.272679 -0.070624 -0.011880 -0.464932 -0.298757 -0.346004 -0.617689 -0.335361 -0.181093 0.335306 -0.557584 -0.008651 -0.466140 -0.217917 -0.114165 2.582915 -0.223187 -0.733704 -0.245529 -0.427051 1.102205 0.024734 0.176155 -0.272931 -0.379963 -0.241519 -0.169723 -0.480085 -0.045232 -0.239782 -0.261018 0.178780 -0.682481 0.334529 -0.262807 1.301265 -1.334408
408 -0.587372 -0.123021 -0.246596 -0.446890 -0.203506 -1.026354 -0.084117 -0.013030 -0.106195 -0.608691 -0.845152 -0.353674 -0.109896 -0.102740 -0.130952 -0.030662 -0.215058 -0.485792 -0.865418 -0.441204 -0.166308 0.348077 -0.228722 -0.083657 0.267551 0.169339 0.071649 -0.181093 0.663615 0.034162 0.295058 -1.396292 -0.150755 -0.114165 -0.711831 -0.193395 -0.773249 -0.258779 -0.031374 -0.453917 -0.130500 -0.040822 -0.212707 0.028590 -0.227937 0.125414 -0.071125 -0.065843 -0.182371 -0.577006 -0.396663 -0.561574 0.845047 0.352051 -0.806170 0.749396
409 -0.306053 0.190691 -0.246596 -0.128731 0.492385 1.079840 -0.084117 0.083212 -0.086731 0.722337 0.775134 0.075726 -0.814489 -0.085553 -0.130952 -0.240508 0.972598 1.575314 2.924133 -0.329222 -0.166308 0.624133 -0.218707 -0.384797 -0.329415 -0.353505 0.070561 -0.181093 -0.236195 0.779625 0.821152 -1.211873 0.541243 -0.114165 -0.900448 -0.060246 -0.371715 -0.185557 1.626703 1.102205 -0.420340 -0.304459 -0.116852 1.622565 -0.348440 -0.202105 0.788518 -0.026915 -0.094047 -0.406756 -0.396663 -0.465847 -0.685948 0.938130 -0.820454 0.749396
410 -0.433925 0.560102 -0.246596 -0.357524 -0.203506 -0.549480 -0.084117 -0.360404 -0.034494 -0.091481 -1.250224 0.096002 -0.287565 0.273785 -0.130952 -0.340150 0.034462 -0.085909 -0.241957 -0.301227 -0.166308 -0.721472 -1.017699 0.475603 0.612013 0.660880 -0.300831 -0.181093 0.006997 0.541911 0.655428 0.827478 -0.217917 -0.114165 1.502195 0.282052 0.167720 0.363353 -0.427051 -0.727172 0.024965 -0.304459 -0.332514 -0.379963 0.042917 -0.233561 -0.480085 -0.065843 0.175340 -0.364862 -0.396663 0.877847 -1.129229 -0.542244 1.301265 -1.334408
411 0.060514 -0.886800 -0.246596 -0.390610 0.828343 0.573161 0.335635 0.537356 0.025302 1.464306 1.484010 0.568911 -0.285988 -0.242158 0.331325 0.398542 -0.722417 -1.250474 0.421150 0.790594 -0.166308 -0.027231 2.229264 -0.427817 0.765625 0.336804 -0.632595 0.381262 0.152912 0.518741 -0.374999 0.365535 -0.217917 -0.114165 -0.182766 2.634966 -1.067301 0.157437 -0.427051 -0.824148 3.434789 -0.023073 -0.207089 -0.379963 -0.269331 -2.200526 -0.480085 -0.047700 -0.239782 -0.942465 -0.396663 -0.142434 0.301190 -1.170076 1.301265 -1.334408
412 -0.655570 -0.150799 0.085637 -0.423393 0.327974 -0.774671 0.020055 -0.510783 -0.085916 -0.381329 -0.035009 -0.310695 -0.782887 -0.210519 -0.130952 -0.231127 0.002011 -0.654664 -0.518611 -1.169084 -0.089761 -0.177831 0.235410 -0.610651 -1.327523 -1.469261 -0.172207 -0.181093 0.663615 0.326985 -0.166465 0.196335 -0.217917 0.415237 0.342103 -0.166644 -0.852339 -0.671581 -0.427051 -0.448099 -0.420340 0.126643 -0.312358 -0.379963 0.589755 -0.087381 -0.480085 -0.053096 -0.239782 -1.160850 0.646562 -0.342314 -0.684086 0.478499 1.301265 -1.334408
413 0.265110 0.274512 -0.246596 1.004869 -0.203506 -0.403769 -0.035707 0.483219 -0.048300 0.262043 1.889081 -0.089552 2.648425 -0.185872 -0.130952 -0.152738 -0.364848 0.150602 -0.456364 0.258681 -0.047660 0.664628 0.932297 -1.331236 0.782848 0.381675 -0.975777 -0.075943 -0.868494 -0.395309 -0.271555 1.786277 1.577786 -0.114165 0.309761 0.044328 -0.969960 0.250707 2.217079 -1.119447 0.653595 -0.148861 -0.323200 2.654470 -0.258915 0.432579 1.818692 -0.056968 0.396150 1.187894 -0.396663 -0.345544 0.953446 -0.577283 -0.825959 0.749396
414 -0.595896 1.618871 -0.246596 -0.285435 0.347384 -0.569350 -0.084117 -0.382961 -0.106195 -0.381900 0.167527 -0.220631 0.393171 0.255708 -0.130952 -0.431068 -0.058357 -0.301162 -0.427587 -0.637172 -0.166308 1.608722 -0.524329 0.765987 -0.748569 -0.696382 -0.459452 -0.181093 2.998258 0.162020 0.775122 0.985935 -0.155608 -0.114165 -0.867119 -0.152660 -0.138500 -0.237087 -0.119302 -0.770612 -0.146893 -0.304459 -0.359226 -0.184353 -0.236964 -0.396396 -0.226153 -0.060374 -0.239782 -1.228593 -0.396663 -0.343041 -0.439723 0.063418 -0.810352 0.749396
415 -0.280478 -0.612048 -0.246596 -0.334502 -0.203506 -0.552792 -0.032644 -0.623567 0.013216 -0.331777 0.572598 -0.249446 0.239837 -0.172918 -0.130952 -0.340478 -0.488602 0.087423 -0.861218 -0.273231 -0.166308 0.564581 -0.787475 0.733722 0.845211 0.280013 0.009520 0.299072 0.177231 -0.075419 0.887657 -0.144750 -0.156578 -0.114165 -0.407180 0.671164 -0.112137 0.022316 -0.282598 -1.244121 -0.420340 0.054171 -0.208666 -0.128641 -0.267397 -0.148443 0.200315 -0.056067 0.193005 -0.716059 -0.396663 -0.671808 0.183106 -0.696946 -0.821720 0.749396
416 0.401507 0.274799 -0.246596 -0.029387 0.638528 -0.872364 -0.068185 0.499761 -0.057625 0.283902 0.167527 0.161597 -0.485373 19.945725 0.839274 -0.261171 1.079819 0.786995 -0.697202 0.678613 -0.166308 1.639425 -1.084809 0.120688 -1.327523 -0.142914 2.669568 -0.181093 -1.379198 0.563259 0.314887 1.679744 -0.217723 -0.114165 0.792043 -0.305873 -0.606957 -0.837864 -0.194669 -1.420800 -0.420340 0.011103 -0.345328 -0.240684 -0.182311 -0.767399 -0.463635 -0.015127 0.025190 -0.719624 -0.396663 -0.719406 4.911682 0.968623 0.263201 0.749396
417 -0.672620 0.982976 -0.246596 -0.390005 -0.203506 -0.883955 -0.084117 0.639613 -0.106195 -0.278600 0.673866 -0.182219 -0.489316 -0.073526 -0.130952 -0.325063 -1.319564 -0.084274 0.059646 -0.525190 -0.166308 0.184244 -0.045579 -0.105167 -0.784656 0.706429 -0.726010 -0.181093 0.542019 -0.360406 0.664161 0.747802 -0.126492 -0.114165 -0.954514 -0.698026 -1.057161 -0.129781 0.031432 -0.984837 0.251773 0.012775 -0.311668 0.097920 -0.195941 -0.011514 -0.000258 -0.050139 -0.080799 0.991794 1.784680 -0.732673 -1.015243 -0.207801 -0.810072 0.749396
418 0.137238 -0.964837 0.234046 0.485732 -0.203506 -0.234876 -0.014260 -1.739376 -0.106195 0.421173 1.281474 1.284960 0.389256 -0.220210 -0.130952 0.329337 0.186145 -0.185211 -0.185391 -0.049268 -0.101243 0.918451 0.390811 -1.546335 0.752183 0.587711 -0.714914 -0.181093 -0.576664 0.187202 -0.332118 -0.822445 -0.148814 -0.114165 -0.569628 -0.214067 0.346179 -0.131689 0.144483 -0.903613 -0.420340 0.024784 5.845248 0.177154 -0.443666 -0.250215 -0.094962 -0.054800 0.250415 0.025556 -0.396663 0.071740 -0.672538 0.245046 -0.795688 0.749396
419 -0.374251 0.706765 -0.246596 -0.266516 -0.203506 -0.330913 -0.084117 -0.401007 0.015117 -0.453787 0.167527 -0.228574 -0.582347 -0.199349 -0.130952 -0.437169 -0.950149 -0.708329 -0.389424 -0.413208 -0.166308 -0.529848 -0.140776 -0.653671 0.622110 0.507758 -0.500867 -0.181093 -0.065961 -0.986462 0.006633 -1.123245 -0.078935 -0.114165 -0.403230 -0.698026 0.090658 -0.255644 0.270095 -0.992517 -0.420340 -0.304459 -0.248684 0.302815 -0.410566 0.005139 0.069765 -0.054480 -0.239782 0.073690 -0.396663 -0.587488 -1.189947 -0.335438 -0.810106 0.749396
420 0.137238 -0.568382 -0.246596 0.286354 -0.203506 -0.069294 -0.084117 1.403536 -0.106195 -0.446077 0.268795 -0.182792 -0.272779 -0.017280 -0.130952 0.047924 -0.386647 1.162498 1.309038 -0.189245 0.166672 0.138720 -0.210074 -0.546121 0.749161 -0.782150 0.263330 -0.181093 -1.208963 -0.619046 -0.979410 -0.406160 -0.217917 -0.114165 -0.995003 -0.013431 2.619510 0.067583 -0.427051 -1.172046 -0.420340 -0.281745 -0.319257 -0.379963 -0.565917 -0.392695 -0.480085 -0.062620 -0.217701 0.205611 -0.396663 0.729023 0.157403 -0.890534 1.301265 -1.334408
421 -0.536223 -0.161130 -0.246596 -0.243753 -0.203506 -1.821145 -0.084117 -0.208522 -0.050473 -0.390289 0.370063 -0.270296 0.573178 -0.048171 -0.130952 -0.189342 -0.492164 -1.250474 -0.512807 -0.497194 -0.166308 -2.397656 -1.506004 -0.158942 0.644226 0.894828 -0.355035 -0.181093 -0.771218 -0.660048 -0.389388 0.419250 0.528626 -0.114165 -0.170175 -0.290673 -0.533951 -0.167572 1.526214 -1.007216 -0.420340 -0.304459 -0.453110 0.659371 -0.098635 -2.200526 0.208456 -0.045126 -0.222117 -0.933552 -0.396663 -0.673099 -0.934781 0.495983 -0.820849 0.749396
422 -0.519173 1.129397 -0.246596 -0.221293 -0.203506 0.318166 -0.084117 -1.146884 -0.063283 -0.468859 -1.047688 -0.204503 -0.160454 -0.163123 -0.130952 0.044251 0.158334 -0.731073 0.990391 -0.777149 -0.166308 -1.418361 -0.214908 -0.137432 -1.327523 2.404534 -0.129744 -0.181093 0.858169 -0.895222 0.372801 -0.887797 -0.217917 -0.114165 -0.263989 1.191602 -0.014796 -0.267239 -0.427051 1.102205 0.864318 -0.304459 -0.353804 -0.379963 -0.171528 0.489941 -0.480085 -0.042091 -0.239782 -0.948705 0.896059 0.145730 -1.368750 -0.903053 1.301265 -1.334408
423 -0.067358 0.389726 -0.009726 1.211029 -0.203506 -0.264680 -0.084117 0.666681 -0.106195 -0.177778 1.686546 0.119720 0.089573 -0.065416 -0.130952 -0.217549 0.336586 -0.034326 -0.571719 0.370663 -0.139517 -0.367338 -0.693774 -0.266492 -1.327523 -1.349865 4.701673 -0.181093 -0.187557 -0.666783 -1.351163 -0.472407 -0.157549 -0.114165 -0.058338 -0.698026 1.171555 -0.250092 -0.357965 -0.598033 -0.420340 -0.304459 -0.287223 -0.233255 -0.425420 -0.153995 0.373832 -0.055051 0.188589 1.047950 -0.396663 -0.467151 0.202848 -0.908228 -0.827041 0.749396
424 0.503805 -1.162642 0.080985 -0.047441 -0.203506 -0.364029 -0.084117 1.196014 -0.106195 0.566091 0.775134 -0.134782 -0.121444 -0.321829 -0.130952 0.266495 -0.622521 0.662570 0.008514 -0.217240 0.139881 -0.371043 -0.711271 0.948822 0.667300 1.184403 -1.117190 -0.181093 0.444742 0.313388 0.542606 0.475649 -0.217917 -0.114165 0.467765 0.333123 -1.016602 0.347873 -0.427051 1.102205 0.024965 0.010962 -0.418809 -0.379963 1.674138 -0.198404 -0.480085 -0.051954 -0.239782 1.975860 -0.396663 0.360671 -1.302816 -0.703381 1.301265 -1.334408
425 -0.672620 -0.282543 0.071531 -0.412033 -0.203506 -0.397145 -0.084117 0.272689 -0.085373 -0.080012 -1.351492 -0.265553 -0.887720 -0.187734 -0.130952 -0.393612 -0.246005 -0.730775 -0.751298 -0.777149 -0.043833 -0.057933 -0.084832 -1.739925 0.643449 -0.085735 0.622028 -0.181093 -0.746898 -0.986462 -1.443081 -1.902101 -0.217917 -0.114165 0.388516 -0.698026 0.364431 -0.447597 -0.427051 -1.156155 0.021271 -0.304459 -0.340646 -0.379963 0.144613 -0.313128 -0.480085 -0.056493 -0.098464 -0.166087 -0.396663 -0.118662 0.582058 -1.100278 1.301265 -1.334408
426 -0.007684 -0.278495 -0.246596 -0.227556 -0.203506 -0.155397 0.002898 -0.170927 -0.014894 0.025040 -0.136277 0.733483 -0.986384 0.028818 -0.130952 -0.271010 0.063066 -0.340853 0.364337 0.174695 0.109262 0.324521 1.294440 -0.051392 -1.327523 -0.524750 0.390786 -0.181093 0.250189 -0.359220 0.092752 0.110393 -0.217723 -0.114165 0.733162 -0.424431 3.907258 -0.296632 -0.357965 -1.089377 2.081711 -0.126341 -0.176731 -0.366964 0.999988 1.202342 -0.469689 -0.061560 0.020774 0.527394 -0.396663 -0.683136 0.728080 0.272952 -0.486512 0.749396
427 -0.297528 0.898728 0.255505 -0.340376 -0.203506 -0.185201 -0.084117 1.099771 -0.106195 0.663483 1.078938 -0.037394 -0.751722 -0.141891 -0.130952 -0.355369 0.912100 -0.010243 -0.750310 0.846585 -0.085934 0.361576 2.928799 -0.126677 0.691039 1.610043 0.281855 -0.181093 0.347465 -0.564488 -0.513520 -1.327359 -0.217917 -0.114165 1.371472 -0.432943 1.147219 -0.206511 -0.427051 0.122548 -0.185143 -0.077815 -0.133608 -0.379963 -0.314807 -0.388069 -0.480085 -0.055796 -0.239782 0.322380 0.076666 -0.470636 -0.302268 -0.161327 1.301265 -1.334408
428 -0.450975 1.541316 -0.246596 -0.461360 -0.203506 -0.864085 -0.084117 -1.739376 -0.106195 -0.941030 -0.035009 -0.622253 0.021637 -0.189527 -0.130952 -0.411258 1.408819 -0.859065 -0.513671 -0.105259 -0.166308 -0.414715 -0.876342 -1.825965 0.754944 -1.041876 -0.165544 -0.181093 -1.087367 -0.986462 -2.887210 -0.393626 -0.217917 -0.114165 -0.332869 -0.698026 -0.282485 -0.320256 -0.427051 1.102205 0.999464 -0.304459 -0.159088 -0.379963 -0.628484 -0.596238 -0.480085 -0.055680 -0.239782 -1.214331 -0.396663 0.013568 0.521340 -1.378560 1.301265 -1.334408
429 0.000841 -0.187329 -0.246596 -0.122597 -0.203506 -0.850838 -0.084117 0.493746 -0.106195 0.092288 0.572598 -0.234134 -0.381526 -0.037638 -0.130952 -0.149590 0.260845 -0.127830 0.166109 0.034718 0.074816 0.619634 0.363644 -1.922760 1.128647 1.251466 -0.861889 -0.181093 -0.917133 -0.564785 -1.340281 2.087077 -0.089611 -0.114165 -0.388170 0.578750 -0.501503 0.372585 0.307778 -1.005921 -0.420340 0.056741 -0.239912 0.226676 -0.278742 0.447382 -0.035128 -0.052119 -0.063134 0.661990 -0.396663 0.251686 -0.984697 -1.132309 -0.806950 0.749396
430 -0.595896 1.810744 -0.246596 -0.269712 -0.203506 0.003561 -0.084117 0.337352 -0.106195 -0.104924 0.977670 -0.288146 -0.457545 -0.222606 -0.130952 -0.086682 -0.275580 -0.046367 -0.334340 -0.553185 -0.166308 0.039467 -0.617340 0.701457 0.367835 1.704243 -0.270472 -0.181093 -0.260514 0.106679 0.161118 1.626925 -0.195400 -0.114165 -0.342744 0.886392 -1.069329 -0.081964 -0.131863 1.102205 -0.201921 -0.078273 -0.348827 -0.043216 -0.436824 1.764860 -0.218773 -0.050086 0.135594 -0.122411 0.095189 -0.765098 -0.571589 1.708076 -0.774863 0.749396
In [60]:
# importing libraries to balance the target variable by SMOTE:

import os

os.environ['THREADPOOLCTL_USE'] = '0'

# Now run your code with SMOTE
from imblearn.over_sampling import SMOTE

# Create the SMOTE object
smote = SMOTE(random_state=42)

# Apply SMOTE on the scaled training set
x_train_resampled, y_train_resampled = smote.fit_resample(x_train_scaled, y_train)
In [61]:
# Balancing the class of train dataset using oversampling method: SMOTE

from imblearn.over_sampling import SMOTE

# Create the SMOTE object
smote = SMOTE(random_state=42)

# Apply SMOTE on the scaled training set
x_train_resampled, y_train_resampled = smote.fit_resample(x_train_scaled, y_train)
In [62]:
# Checking the resampled input variable of training set

x_train_resampled.head(8)
Out[62]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL EJ
0 -0.578847 -0.813207 -0.246596 -0.485677 -0.189234 -0.950187 -0.084117 -0.250628 -0.106195 -0.611691 -0.946420 -0.725135 -0.991933 -0.231473 -0.130952 -0.437169 -0.551490 -0.739397 -0.221702 -0.917126 -0.166308 -0.656098 -0.209843 0.185218 -1.327523 -0.165398 -0.254580 -0.166785 0.736573 0.337447 -0.440000 -0.712331 -0.217917 -0.114165 -0.169187 -0.514413 -1.085553 -0.357101 -0.427051 -0.784791 -0.420340 -0.068096 -0.216354 -0.379963 3.917970 -0.002263 -0.480085 -0.065843 -0.116128 -0.501240 -0.254094 0.402761 -0.202064 0.897986 1.301265 -1.334408
1 0.154287 1.373691 -0.246596 -0.197105 -0.203506 -0.215006 -0.084117 -1.158915 -0.106195 -0.264544 0.268795 -0.347287 -1.016493 0.852010 -0.130952 -0.425427 0.131706 -0.641582 -0.404985 0.398658 -0.166308 -0.722531 0.164961 -0.008372 0.748316 0.751784 -0.809338 -0.181093 -1.038729 0.248368 -0.634861 -2.002368 -0.217917 -0.114165 3.881630 -0.369712 0.903865 -0.123023 -0.427051 1.102205 0.009418 -0.281041 0.050020 -0.379963 2.328843 1.330019 -0.480085 -0.052777 -0.173539 0.961042 -0.396663 -0.755450 0.118290 1.294184 1.301265 -1.334408
2 -0.527698 0.103716 -0.246596 -0.426589 -0.203506 0.612901 -0.038771 -1.739376 -0.106195 0.149485 0.876402 -0.211431 1.184260 -0.170499 0.260843 -0.228503 -1.510491 -0.390058 -0.133024 -0.637172 -0.166308 -0.273908 -0.609973 0.443338 -1.327523 0.507370 -0.895136 -0.181093 -0.211876 -0.391073 0.548620 -0.390940 0.011326 -0.114165 -0.060066 0.244356 -0.799612 -0.207139 0.445951 1.102205 0.192743 -0.304459 -0.412353 0.190773 4.684170 -0.672104 -0.006608 -0.045489 -0.054302 -1.012883 0.028176 -0.669144 -1.294994 -0.831437 -0.815500 0.749396
3 -0.433925 0.772965 -0.246596 -0.401710 -0.203506 -1.076029 -0.084117 -1.739376 -0.090624 0.796538 0.775134 0.620209 -0.997453 0.094638 0.034819 -0.218073 -0.037031 -0.960002 -0.677441 -1.253070 -0.166308 -0.856721 1.693993 -1.438786 -0.428232 2.004963 -0.855011 -0.181093 -1.306240 -0.114855 -0.594843 -1.648749 0.220964 -0.114165 -0.194122 0.775130 0.350235 0.185266 0.533880 -1.404094 -0.420340 0.044610 -0.313442 0.448904 2.767541 -0.986670 0.383630 -0.061318 -0.239782 -0.259681 -0.396663 -0.629923 -1.372102 -0.400550 -0.822730 0.749396
4 -0.050308 -0.519493 -0.183946 0.029572 -0.203506 0.616212 -0.084117 -1.046131 -0.106195 -0.942073 -0.440081 -0.635050 -0.750640 0.887792 -0.130952 -0.382460 -0.512117 0.115519 -0.053980 0.174695 -0.166308 0.421128 -1.506004 -0.793486 -1.327523 -1.299858 0.858791 -0.181093 -0.673941 -0.616229 -0.127236 -0.202941 -0.125521 -0.114165 0.563308 -0.698026 0.565198 -0.837864 -0.295159 -0.791123 -0.420340 -0.230367 -0.399638 -0.328584 -0.578600 -1.069938 -0.225156 -0.065843 -0.239782 -0.908594 -0.396663 -0.740521 2.121993 -1.108181 -0.825925 0.749396
5 -0.493599 -0.372892 -0.148156 0.086845 -0.203506 1.656063 -0.084117 1.845649 -0.106195 0.752954 -0.136277 0.179512 -0.593107 0.061663 0.260671 -0.260777 0.131682 0.316055 -0.637178 -0.581181 0.105434 -1.425242 0.694937 1.841486 -1.327523 0.821368 0.890185 -0.181093 0.006997 0.614259 -0.503212 -0.259341 -0.217917 -0.114165 -0.810830 -0.030454 -0.300736 0.027032 -0.427051 1.102205 -0.420340 -0.079224 0.597405 -0.379963 -0.260700 -0.248364 -0.480085 -0.057171 -0.217701 1.699537 -0.396663 1.727410 -1.140032 0.197489 1.301265 -1.334408
6 0.103138 0.262438 -0.246596 -0.201165 -0.203506 -0.099099 -0.084117 0.994506 -0.050880 -0.173295 0.775134 -0.261449 0.859735 0.309910 -0.130952 -0.015509 -0.303474 -1.250474 4.781053 -0.133254 -0.166308 0.033644 -0.088515 -0.535366 -1.327523 -0.062283 -0.361939 -0.181093 -0.138918 -0.257560 0.213806 -0.414217 -0.058553 -0.114165 -1.400381 0.081415 -1.026742 -0.041610 0.282656 -0.021718 -0.420340 -0.304459 -0.393724 0.152394 -0.153704 -2.200526 -0.026547 -0.024844 0.069352 -0.134890 -0.396663 -0.306927 1.320363 1.629745 -0.812611 0.749396
7 0.213961 0.080134 0.081135 -0.274809 -0.203506 -0.218318 -0.084117 1.597525 -0.106195 0.534563 0.167527 0.025421 0.224064 0.008408 -0.130952 0.267216 0.327473 -0.066286 -0.065095 -0.385213 -0.017041 0.237179 0.153105 -0.503101 0.720467 0.971292 0.302690 -0.181093 -0.892814 -0.144993 0.080797 -0.690845 0.181171 -0.114165 0.670454 -0.067542 -0.073606 0.399997 0.157044 -1.166922 -0.420340 -0.190574 1.199544 0.229771 -0.530029 -0.551829 0.528207 -0.036521 0.268080 0.367840 -0.233196 -0.529183 0.795876 -0.600082 -0.825954 0.749396
In [63]:
# Checking the resampled target variable of training set

y_train_resampled = pd.DataFrame({'Class': y_train_resampled})
y_train_resampled.head(8)
Out[63]:
Class
0 0.0
1 0.0
2 0.0
3 1.0
4 0.0
5 0.0
6 0.0
7 1.0
In [64]:
# Print the shapes of the resulting datasets

print("x_train shape of over-sampled data:", x_train_resampled.shape)
print("y_train shape of over-sampled data:", y_train_resampled.shape)
x_train shape of over-sampled data: (712, 56)
y_train shape of over-sampled data: (712, 1)
In [65]:
# Renaming the balanced dataframes

x_train_processed = x_train_resampled
x_valid_processed = x_valid_scaled
y_train_processed = y_train_resampled
y_valid_processed = y_valid
In [66]:
# checking the distribution of 'y_train_resampled' variable(output variable)
class_distribution = y_train_processed['Class'].value_counts()

# Plotting the pie chart
plt.figure(figsize=(8, 6))
plt.pie(class_distribution, labels=class_distribution.index, autopct='%1.1f%%')
plt.title('Distribution of Output Class')
plt.show()

Feature selection: Training set¶

(A) Pearson correlation matrix¶

In [67]:
# Concatenate x_train_resampled and y_train_resampled into a single DataFrame

concatenate_df = pd.concat([x_train_processed, y_train_processed], axis=1)
In [68]:
# Plotting the correlation between each feature & target variable 'class'

# Calculate correlation with Class
corr_with_class = concatenate_df.corr()['Class'].drop('Class')

plt.figure(figsize=(15, 7))
corr_with_class.plot(kind='bar')
plt.xlabel('Features')
plt.ylabel('Correlation with Class')
plt.title('Correlation of Features with Class')
plt.show()
In [69]:
# Select features with correlation >= 0.1 or <= -0.1
selected_features = corr_with_class[(corr_with_class >= 0.1) | (corr_with_class <= -0.1)]

# Plot the selected features
plt.figure(figsize=(15, 7))
selected_features.plot(kind='bar', color=np.where(selected_features >= 0, 'green', 'red'))
plt.xlabel('Features')
plt.ylabel('Correlation with Class')
plt.title('Features with Correlation >= 0.1 or <= -0.1')
plt.show()
In [70]:
# Printing the selected features

selected_columns = selected_features.index.tolist()
print("Selected Columns:")
print(selected_columns)
Selected Columns:
['AB', 'AF', 'AM', 'AR', 'AX', 'BC', 'BD ', 'BN', 'BP', 'BQ', 'BZ', 'CD ', 'CR', 'CU', 'CW ', 'DA', 'DE', 'DH', 'DI', 'DL', 'DU', 'EB', 'EE', 'EH', 'FD ', 'FE', 'FI', 'FL', 'GB', 'GF', 'GI', 'GL', 'EJ']
In [71]:
len(selected_columns)
Out[71]:
33

(B) K-Best Method¶

In [72]:
from sklearn.feature_selection import SelectKBest, f_classif

# Number of top features to select
k = 25

# Create the feature selector
selector = SelectKBest(score_func=f_classif, k=k)

# Perform feature selection
x_train_selected = selector.fit_transform(x_train_processed, y_train_processed)

# Get the indices of the selected features
selected_feature_indices = selector.get_support(indices=True)

# Get the column names of the selected features
selected_feature_names = [x_train_processed.columns[idx] for idx in selected_feature_indices]

# Print the column names of the selected features
print("Selected column names:")
for name in selected_feature_names:
    print(name)
Selected column names:
AB
AF
AM
BC
BN
BQ
CD 
CR
CU
CW 
DA
DE
DH
DI
DL
DU
EB
EE
EH
FD 
FE
FL
GF
GL
EJ

Feature Selection: Top 20 in both¶

In [73]:
# Convert the selected columns lists to sets

Pearson_corr = set(selected_columns)
Kbest = set(selected_feature_names)

# Find the common elements
common_elements = Pearson_corr.intersection(Kbest)

# Print the common elements
print("Common elements:", common_elements)
Common elements: {'CW ', 'DL', 'FL', 'EE', 'AB', 'DI', 'DA', 'CU', 'EB', 'AF', 'GF', 'AM', 'DU', 'EH', 'DH', 'CR', 'BQ', 'FE', 'BN', 'FD ', 'EJ', 'BC', 'DE', 'GL', 'CD '}
In [74]:
# List of columns which are most related to target variable

common_columns = sorted(list(common_elements))
print("Common columns:", common_elements)
Common columns: {'CW ', 'DL', 'FL', 'EE', 'AB', 'DI', 'DA', 'CU', 'EB', 'AF', 'GF', 'AM', 'DU', 'EH', 'DH', 'CR', 'BQ', 'FE', 'BN', 'FD ', 'EJ', 'BC', 'DE', 'GL', 'CD '}
In [75]:
len(common_columns)
Out[75]:
25
In [76]:
if isinstance(x_train_processed, pd.DataFrame):
    print("yes")
else:
    print("no")
yes
In [77]:
# Filter the DataFrame columns
x_train_filtered = x_train_processed[common_columns]

# Print the filtered DataFrame
x_train_filtered.head()
Out[77]:
AB AF AM BC BN BQ CD CR CU CW DA DE DH DI DL DU EB EE EH EJ FD FE FL GF GL
0 -0.578847 -0.813207 -0.485677 -0.106195 -0.946420 -0.991933 -0.739397 -0.656098 0.185218 -1.327523 -0.165398 -0.254580 0.736573 0.337447 -0.440000 -0.217917 -0.514413 -1.085553 -0.427051 -1.334408 -0.379963 3.917970 -0.480085 0.402761 1.301265
1 0.154287 1.373691 -0.197105 -0.106195 0.268795 -1.016493 -0.641582 -0.722531 -0.008372 0.748316 0.751784 -0.809338 -1.038729 0.248368 -0.634861 -0.217917 -0.369712 0.903865 -0.427051 -1.334408 -0.379963 2.328843 -0.480085 -0.755450 1.301265
2 -0.527698 0.103716 -0.426589 -0.106195 0.876402 1.184260 -0.390058 -0.273908 0.443338 -1.327523 0.507370 -0.895136 -0.211876 -0.391073 0.548620 0.011326 0.244356 -0.799612 0.445951 0.749396 0.190773 4.684170 -0.006608 -0.669144 -0.815500
3 -0.433925 0.772965 -0.401710 -0.090624 0.775134 -0.997453 -0.960002 -0.856721 -1.438786 -0.428232 2.004963 -0.855011 -1.306240 -0.114855 -0.594843 0.220964 0.775130 0.350235 0.533880 0.749396 0.448904 2.767541 0.383630 -0.629923 -0.822730
4 -0.050308 -0.519493 0.029572 -0.106195 -0.440081 -0.750640 0.115519 0.421128 -0.793486 -1.327523 -1.299858 0.858791 -0.673941 -0.616229 -0.127236 -0.125521 -0.698026 0.565198 -0.295159 0.749396 -0.328584 -0.578600 -0.225156 -0.740521 -0.825925
In [78]:
#Checking Shape of training & vValidation sets:

print('Shape of training set with all features :', x_train_processed.shape)
print('Shape of training set with best features :', x_train_filtered.shape)
print('Shape of training set (target):', y_train_processed.shape)
print('Shape of validation set:', x_valid_processed.shape)
print('Shape of validation set (target) :', y_valid_processed.shape)
Shape of training set with all features : (712, 56)
Shape of training set with best features : (712, 25)
Shape of training set (target): (712, 1)
Shape of validation set: (186, 56)
Shape of validation set (target) : (186,)

Building models, Hypertuning & Evaluating¶

In [79]:
# importing evaluating parameters from sklearn library.

from sklearn.metrics import precision_score, recall_score, accuracy_score, f1_score, classification_report

(A) Support Vector Machines¶

In [80]:
# importing required libraries for SVM:

from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, KFold
In [81]:
# Hyperparameter Tuning using GridSearchCV with k-fold Cross-Validation:

# Create the SVM model
model_SVM = SVC()

# Define the hyperparameter grid
param_grid_SVM = {
    'C': [0.1, 1, 10],                                  # Regularization parameter
    'kernel': ['linear', 'poly', 'rbf', 'sigmoid'],     # Kernel type (linear or radial basis function)
    'gamma': [0.001, 0.01, 0.1],                        # Kernel coefficient for 'rbf'
    'degree': [2, 3, 4],
    'coef0': [0.0, 0.5],
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_SVM = GridSearchCV(model_SVM, param_grid_SVM, cv=kfold)
In [82]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters

grid_search_SVM.fit(x_train_filtered, y_train_processed)
Out[82]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=SVC(),
             param_grid={'C': [0.1, 1, 10], 'coef0': [0.0, 0.5],
                         'degree': [2, 3, 4], 'gamma': [0.001, 0.01, 0.1],
                         'kernel': ['linear', 'poly', 'rbf', 'sigmoid']})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=SVC(),
             param_grid={'C': [0.1, 1, 10], 'coef0': [0.0, 0.5],
                         'degree': [2, 3, 4], 'gamma': [0.001, 0.01, 0.1],
                         'kernel': ['linear', 'poly', 'rbf', 'sigmoid']})
SVC()
SVC()
In [83]:
# Retrieve the best model with optimized hyperparameters

best_model_SVM = grid_search_SVM.best_estimator_
best_params_SVM = grid_search_SVM.best_params_
print(best_params_SVM)
{'C': 10, 'coef0': 0.0, 'degree': 2, 'gamma': 0.1, 'kernel': 'rbf'}
In [84]:
# Train the best model on the full training set
best_model_SVM.fit(x_train_filtered, y_train_processed)
Out[84]:
SVC(C=10, degree=2, gamma=0.1)
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
SVC(C=10, degree=2, gamma=0.1)
In [85]:
# Assuming x_train_filtered.columns and x_valid_processed.columns are not in the same order
x_valid_processed = x_valid_processed.reindex(columns=x_train_filtered.columns)
In [86]:
# Checking the order of columns of training & validation:

print("Columns in x_train_filtered:", x_train_filtered.columns)
print("Columns in x_valid_processed:", x_valid_processed.columns)
Columns in x_train_filtered: Index(['AB', 'AF', 'AM', 'BC', 'BN', 'BQ', 'CD ', 'CR', 'CU', 'CW ', 'DA',
       'DE', 'DH', 'DI', 'DL', 'DU', 'EB', 'EE', 'EH', 'EJ', 'FD ', 'FE', 'FL',
       'GF', 'GL'],
      dtype='object')
Columns in x_valid_processed: Index(['AB', 'AF', 'AM', 'BC', 'BN', 'BQ', 'CD ', 'CR', 'CU', 'CW ', 'DA',
       'DE', 'DH', 'DI', 'DL', 'DU', 'EB', 'EE', 'EH', 'EJ', 'FD ', 'FE', 'FL',
       'GF', 'GL'],
      dtype='object')
In [87]:
# Make predictions on the test set
y_pred_SVM = best_model_SVM.predict(x_valid_processed)
In [88]:
# Evaluate the model using accuracy and classification report
accuracy_SVM = accuracy_score(y_valid_processed, y_pred_SVM)
accuracy_percentage_SVM = accuracy_SVM * 100

# Print the accuracy
print("Accuracy of SVM Model: {:.2f}%".format(accuracy_percentage_SVM))

# Convert the classification report string to a DataFrame
classification_report_SVM = classification_report(y_valid_processed, y_pred_SVM, output_dict=True)
classification_report_SVM_df = pd.DataFrame(classification_report_SVM).transpose()

# Convert classification report values to percentages
classification_report_SVM_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("Support Vector Machine Classification Report:")
print(classification_report_SVM_df)
Accuracy of SVM Model: 86.56%
Support Vector Machine Classification Report:
              precision     recall   f1-score     support
0.0           90.000000  94.117647  92.012780  153.000000
1.0           65.384615  51.515152  57.627119   33.000000
accuracy      86.559140  86.559140  86.559140    0.865591
macro avg     77.692308  72.816399  74.819949  186.000000
weighted avg  85.632754  86.559140  85.912098  186.000000

(B) Decision Tree Model¶

In [89]:
# Importing required libraries for DT:

from sklearn.tree import DecisionTreeClassifier
In [90]:
# Create the Decision Tree classifier
model_DT = DecisionTreeClassifier(random_state=42)

# Define the hyperparameter grid
param_grid_DT = {
    'criterion': ['gini', 'entropy'],
    'max_depth': [None, 10, 20, 30],
    'min_samples_split': [2, 5, 10],
    'min_samples_leaf': [1, 2, 4],
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_DT = GridSearchCV(model_DT, param_grid_DT, cv=kfold)
In [91]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters of DT Model

grid_search_DT.fit(x_train_filtered, y_train_processed)
Out[91]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=DecisionTreeClassifier(random_state=42),
             param_grid={'criterion': ['gini', 'entropy'],
                         'max_depth': [None, 10, 20, 30],
                         'min_samples_leaf': [1, 2, 4],
                         'min_samples_split': [2, 5, 10]})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=DecisionTreeClassifier(random_state=42),
             param_grid={'criterion': ['gini', 'entropy'],
                         'max_depth': [None, 10, 20, 30],
                         'min_samples_leaf': [1, 2, 4],
                         'min_samples_split': [2, 5, 10]})
DecisionTreeClassifier(random_state=42)
DecisionTreeClassifier(random_state=42)
In [92]:
# Retrieve the best model with optimized hyperparameters: Decision Tree

best_model_DT = grid_search_DT.best_estimator_
best_params_DT = grid_search_DT.best_params_
print(best_params_DT)
{'criterion': 'entropy', 'max_depth': None, 'min_samples_leaf': 1, 'min_samples_split': 10}
In [93]:
# Train the best model on the full training set
best_model_DT.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_DT = best_model_DT.predict(x_valid_processed)
In [94]:
# Evaluate the model using accuracy and classification report
accuracy_DT = accuracy_score(y_valid_processed, y_pred_DT)
accuracy_percentage_DT = accuracy_DT * 100

# Print the accuracy
print("Accuracy of DT Model: {:.2f}%".format(accuracy_percentage_DT))

# Convert the classification report string to a DataFrame
classification_report_DT = classification_report(y_valid_processed, y_pred_DT, output_dict=True)
classification_report_DT_df = pd.DataFrame(classification_report_DT).transpose()

# Convert classification report values to percentages
classification_report_DT_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("Decision Tree Classification Report:")
print(classification_report_DT_df)
Accuracy of DT Model: 86.02%
Decision Tree Classification Report:
              precision     recall   f1-score     support
0.0           93.793103  88.888889  91.275168  153.000000
1.0           58.536585  72.727273  64.864865   33.000000
accuracy      86.021505  86.021505  86.021505    0.860215
macro avg     76.164844  80.808081  78.070016  186.000000
weighted avg  87.537915  86.021505  86.589469  186.000000

(C) Random Forest Model¶

In [95]:
# importing required libraries for RF:

from sklearn.ensemble import RandomForestClassifier
In [96]:
# Create the Random Forest classifier
model_RF = RandomForestClassifier(random_state=42)

# Define the hyperparameter grid
param_grid_RF = {
    'n_estimators': [300], # Number of trees in the forest
    'criterion': ['gini'],
    'max_depth': [30],
    'min_samples_split': [5],
    'min_samples_leaf': [2],
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_RF = GridSearchCV(model_RF, param_grid_RF, cv=kfold)
In [97]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters of the Random Forest Model

grid_search_RF.fit(x_train_filtered, y_train_processed)
Out[97]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=RandomForestClassifier(random_state=42),
             param_grid={'criterion': ['gini'], 'max_depth': [30],
                         'min_samples_leaf': [2], 'min_samples_split': [5],
                         'n_estimators': [300]})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=RandomForestClassifier(random_state=42),
             param_grid={'criterion': ['gini'], 'max_depth': [30],
                         'min_samples_leaf': [2], 'min_samples_split': [5],
                         'n_estimators': [300]})
RandomForestClassifier(random_state=42)
RandomForestClassifier(random_state=42)
In [98]:
# Retrieve the best model with optimized hyperparameters: Random Forest

best_model_RF = grid_search_RF.best_estimator_
best_params_RF = grid_search_RF.best_params_
print(best_params_RF)
{'criterion': 'gini', 'max_depth': 30, 'min_samples_leaf': 2, 'min_samples_split': 5, 'n_estimators': 300}
In [99]:
# Train the best model on the full training set
best_model_RF.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_RF = best_model_RF.predict(x_valid_processed)
In [100]:
# Evaluate the model using accuracy and classification report
accuracy_RF = accuracy_score(y_valid_processed, y_pred_RF)
accuracy_percentage_RF = accuracy_RF * 100

# Print the accuracy
print("Accuracy of Random Forest Model: {:.2f}%".format(accuracy_percentage_RF))

# Convert the classification report string to a DataFrame
classification_report_RF = classification_report(y_valid_processed, y_pred_RF, output_dict=True)
classification_report_RF_df = pd.DataFrame(classification_report_RF).transpose()

# Convert classification report values to percentages
classification_report_RF_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("Random Forest Classification Report:")
print(classification_report_RF_df)
Accuracy of Random Forest Model: 90.32%
Random Forest Classification Report:
              precision     recall   f1-score     support
0.0           95.918367  92.156863  94.000000  153.000000
1.0           69.230769  81.818182  75.000000   33.000000
accuracy      90.322581  90.322581  90.322581    0.903226
macro avg     82.574568  86.987522  84.500000  186.000000
weighted avg  91.183471  90.322581  90.629032  186.000000

(D) Logistic Regression Model¶

In [101]:
# Importing required libraries for logistic regression:

from sklearn.linear_model import LogisticRegression
In [102]:
# Create the Logistic Regression classifier
model_LR = LogisticRegression(random_state=42)

# Define the hyperparameter grid
param_grid_LR = {
    'C': [0.1, 1.0, 10.0],  # Inverse of regularization strength
    'penalty': ['l1', 'l2'], # Regularization type (L1 or L2)
    'solver': ['liblinear', 'saga'] # Solver to use for optimization (depending on the regularization)
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_LR = GridSearchCV(model_LR, param_grid_LR, cv=kfold)
In [103]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters of the Logistic Regression Model

grid_search_LR.fit(x_train_filtered, y_train_processed)
Out[103]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=LogisticRegression(random_state=42),
             param_grid={'C': [0.1, 1.0, 10.0], 'penalty': ['l1', 'l2'],
                         'solver': ['liblinear', 'saga']})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=LogisticRegression(random_state=42),
             param_grid={'C': [0.1, 1.0, 10.0], 'penalty': ['l1', 'l2'],
                         'solver': ['liblinear', 'saga']})
LogisticRegression(random_state=42)
LogisticRegression(random_state=42)
In [104]:
# Retrieve the best model with optimized hyperparameters: Logistic Regression

best_model_LR = grid_search_LR.best_estimator_
best_params_LR = grid_search_LR.best_params_
print(best_params_LR)
{'C': 10.0, 'penalty': 'l1', 'solver': 'liblinear'}
In [105]:
# Train the best model on the full training set
best_model_LR.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_LR = best_model_LR.predict(x_valid_processed)
In [106]:
# Evaluate the model using accuracy and classification report
accuracy_LR = accuracy_score(y_valid_processed, y_pred_LR)
accuracy_percentage_LR = accuracy_LR * 100

# Print the accuracy
print("Accuracy of Logistic Regression Model: {:.2f}%".format(accuracy_percentage_LR))

# Convert the classification report string to a DataFrame
classification_report_LR = classification_report(y_valid_processed, y_pred_LR, output_dict=True)
classification_report_LR_df = pd.DataFrame(classification_report_LR).transpose()

# Convert classification report values to percentages
classification_report_LR_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("Logistic Regression Classification Report:")
print(classification_report_LR_df)
Accuracy of Logistic Regression Model: 86.02%
Logistic Regression Classification Report:
              precision     recall   f1-score     support
0.0           95.035461  87.581699  91.156463  153.000000
1.0           57.777778  78.787879  66.666667   33.000000
accuracy      86.021505  86.021505  86.021505    0.860215
macro avg     76.406619  83.184789  78.911565  186.000000
weighted avg  88.425227  86.021505  86.811499  186.000000

(E) K-Nearest Neighbors Model (KNN)¶

In [107]:
# Importing required libraries for KNN:

from sklearn.neighbors import KNeighborsClassifier
In [108]:
# Create the K-Nearest Neighbors classifier
model_KNN = KNeighborsClassifier()

# Define the hyperparameter grid
param_grid_KNN = {
    'n_neighbors': [3, 5, 7],  # Number of neighbors to consider
    'weights': ['uniform', 'distance'],  # Weight function used in prediction
    'p': [1, 2]  # Power parameter for the Minkowski distance
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_KNN = GridSearchCV(model_KNN, param_grid_KNN, cv=kfold)
In [109]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters

grid_search_KNN.fit(x_train_filtered, y_train_processed)
Out[109]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=KNeighborsClassifier(),
             param_grid={'n_neighbors': [3, 5, 7], 'p': [1, 2],
                         'weights': ['uniform', 'distance']})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=KNeighborsClassifier(),
             param_grid={'n_neighbors': [3, 5, 7], 'p': [1, 2],
                         'weights': ['uniform', 'distance']})
KNeighborsClassifier()
KNeighborsClassifier()
In [110]:
# Retrieve the best model with optimized hyperparameters: K-Nearest Neighbors

best_model_KNN = grid_search_KNN.best_estimator_
best_params_KNN = grid_search_KNN.best_params_
print(best_params_KNN)
{'n_neighbors': 7, 'p': 1, 'weights': 'distance'}
In [111]:
# Train the best model on the full training set
best_model_KNN.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_KNN = best_model_KNN.predict(x_valid_processed)
In [112]:
# Evaluate the model using accuracy and classification report
accuracy_KNN = accuracy_score(y_valid_processed, y_pred_KNN)
accuracy_percentage_KNN = accuracy_KNN * 100

# Print the accuracy
print("Accuracy of K-Nearest Neighbors Model: {:.2f}%".format(accuracy_percentage_KNN))

# Convert the classification report string to a DataFrame
classification_report_KNN = classification_report(y_valid_processed, y_pred_KNN, output_dict=True)
classification_report_KNN_df = pd.DataFrame(classification_report_KNN).transpose()

# Convert classification report values to percentages
classification_report_KNN_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("K-Nearest Neighbors Classification Report:")
print(classification_report_KNN_df)
Accuracy of K-Nearest Neighbors Model: 84.95%
K-Nearest Neighbors Classification Report:
              precision     recall   f1-score     support
0.0           96.992481  84.313725  90.209790  153.000000
1.0           54.716981  87.878788  67.441860   33.000000
accuracy      84.946237  84.946237  84.946237    0.849462
macro avg     75.854731  86.096257  78.825825  186.000000
weighted avg  89.491989  84.946237  86.170319  186.000000

(F) Naive Bayes Model¶

In [113]:
# Importing required libraries for NB:

from sklearn.naive_bayes import GaussianNB
In [114]:
# Create the Naive Bayes classifier
model_NB = GaussianNB()
In [115]:
# Train the Naive Bayes model on the training set
model_NB.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_NB = model_NB.predict(x_valid_processed)
In [116]:
# Evaluate the model using accuracy and classification report
accuracy_NB = accuracy_score(y_valid_processed, y_pred_NB)
accuracy_percentage_NB = accuracy_NB * 100

# Print the accuracy
print("Accuracy of Naive Bayes Model: {:.2f}%".format(accuracy_percentage_NB))

# Convert the classification report string to a DataFrame
classification_report_NB = classification_report(y_valid_processed, y_pred_NB, output_dict=True)
classification_report_NB_df = pd.DataFrame(classification_report_NB).transpose()

# Convert classification report values to percentages
classification_report_NB_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("Naive Bayes Classification Report:")
print(classification_report_NB_df)
Accuracy of Naive Bayes Model: 87.63%
Naive Bayes Classification Report:
              precision     recall   f1-score     support
0.0           91.139241  94.117647  92.604502  153.000000
1.0           67.857143  57.575758  62.295082   33.000000
accuracy      87.634409  87.634409  87.634409    0.876344
macro avg     79.498192  75.846702  77.449792  186.000000
weighted avg  87.008546  87.634409  87.227024  186.000000

(G) XGBoost Model¶

In [117]:
!pip install xgboost
Requirement already satisfied: xgboost in c:\users\apara\anaconda3\lib\site-packages (1.7.6)
Requirement already satisfied: scipy in c:\users\apara\anaconda3\lib\site-packages (from xgboost) (1.7.3)
Requirement already satisfied: numpy in c:\users\apara\anaconda3\lib\site-packages (from xgboost) (1.22.1)
In [118]:
# Importing required libraries for XGBoost:

import xgboost as xgb
In [119]:
# Create the XGBoost classifier

model_XGB = xgb.XGBClassifier(random_state=42)


# Define the hyperparameter grid

param_grid_XGB = {
    'n_estimators': [100, 200, 300],  # Number of boosting rounds
    'learning_rate': [0.1, 0.01, 0.001],  # Step size shrinkage to prevent overfitting
    'max_depth': [3, 5, 7],  # Maximum depth of a tree
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_XGB = GridSearchCV(model_XGB, param_grid_XGB, cv=kfold)
In [120]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters of the XGBoost Model

grid_search_XGB.fit(x_train_filtered, y_train_processed)
Out[120]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=XGBClassifier(base_score=None, booster=None,
                                     callbacks=None, colsample_bylevel=None,
                                     colsample_bynode=None,
                                     colsample_bytree=None,
                                     early_stopping_rounds=None,
                                     enable_categorical=False, eval_metric=None,
                                     feature_types=None, gamma=None,
                                     gpu_id=None, grow_policy=None,
                                     importance_type=None,
                                     int...
                                     learning_rate=None, max_bin=None,
                                     max_cat_threshold=None,
                                     max_cat_to_onehot=None,
                                     max_delta_step=None, max_depth=None,
                                     max_leaves=None, min_child_weight=None,
                                     missing=nan, monotone_constraints=None,
                                     n_estimators=100, n_jobs=None,
                                     num_parallel_tree=None, predictor=None,
                                     random_state=42, ...),
             param_grid={'learning_rate': [0.1, 0.01, 0.001],
                         'max_depth': [3, 5, 7],
                         'n_estimators': [100, 200, 300]})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=XGBClassifier(base_score=None, booster=None,
                                     callbacks=None, colsample_bylevel=None,
                                     colsample_bynode=None,
                                     colsample_bytree=None,
                                     early_stopping_rounds=None,
                                     enable_categorical=False, eval_metric=None,
                                     feature_types=None, gamma=None,
                                     gpu_id=None, grow_policy=None,
                                     importance_type=None,
                                     int...
                                     learning_rate=None, max_bin=None,
                                     max_cat_threshold=None,
                                     max_cat_to_onehot=None,
                                     max_delta_step=None, max_depth=None,
                                     max_leaves=None, min_child_weight=None,
                                     missing=nan, monotone_constraints=None,
                                     n_estimators=100, n_jobs=None,
                                     num_parallel_tree=None, predictor=None,
                                     random_state=42, ...),
             param_grid={'learning_rate': [0.1, 0.01, 0.001],
                         'max_depth': [3, 5, 7],
                         'n_estimators': [100, 200, 300]})
XGBClassifier(base_score=None, booster=None, callbacks=None,
              colsample_bylevel=None, colsample_bynode=None,
              colsample_bytree=None, early_stopping_rounds=None,
              enable_categorical=False, eval_metric=None, feature_types=None,
              gamma=None, gpu_id=None, grow_policy=None, importance_type=None,
              interaction_constraints=None, learning_rate=None, max_bin=None,
              max_cat_threshold=None, max_cat_to_onehot=None,
              max_delta_step=None, max_depth=None, max_leaves=None,
              min_child_weight=None, missing=nan, monotone_constraints=None,
              n_estimators=100, n_jobs=None, num_parallel_tree=None,
              predictor=None, random_state=42, ...)
XGBClassifier(base_score=None, booster=None, callbacks=None,
              colsample_bylevel=None, colsample_bynode=None,
              colsample_bytree=None, early_stopping_rounds=None,
              enable_categorical=False, eval_metric=None, feature_types=None,
              gamma=None, gpu_id=None, grow_policy=None, importance_type=None,
              interaction_constraints=None, learning_rate=None, max_bin=None,
              max_cat_threshold=None, max_cat_to_onehot=None,
              max_delta_step=None, max_depth=None, max_leaves=None,
              min_child_weight=None, missing=nan, monotone_constraints=None,
              n_estimators=100, n_jobs=None, num_parallel_tree=None,
              predictor=None, random_state=42, ...)
In [121]:
# Retrieve the best model with optimized hyperparameters: XGBoost

best_model_XGB = grid_search_XGB.best_estimator_
best_params_XGB = grid_search_XGB.best_params_
print(best_params_XGB)
{'learning_rate': 0.1, 'max_depth': 3, 'n_estimators': 200}
In [122]:
# Train the best model on the full training set
best_model_XGB.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_XGB = best_model_XGB.predict(x_valid_processed)
In [123]:
# Evaluate the model using accuracy and classification report
accuracy_XGB = accuracy_score(y_valid_processed, y_pred_XGB)
accuracy_percentage_XGB = accuracy_XGB * 100

# Print the accuracy
print("Accuracy of XGBoost Model: {:.2f}%".format(accuracy_percentage_XGB))

# Convert the classification report string to a DataFrame
classification_report_XGB = classification_report(y_valid_processed, y_pred_XGB, output_dict=True)
classification_report_XGB_df = pd.DataFrame(classification_report_XGB).transpose()

# Convert classification report values to percentages
classification_report_XGB_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("XGBoost Classification Report:")
print(classification_report_XGB_df)
Accuracy of XGBoost Model: 90.32%
XGBoost Classification Report:
              precision     recall   f1-score     support
0.0           95.918367  92.156863  94.000000  153.000000
1.0           69.230769  81.818182  75.000000   33.000000
accuracy      90.322581  90.322581  90.322581    0.903226
macro avg     82.574568  86.987522  84.500000  186.000000
weighted avg  91.183471  90.322581  90.629032  186.000000

(H) Cat Boost Model¶

In [124]:
!pip install catboost
Requirement already satisfied: catboost in c:\users\apara\anaconda3\lib\site-packages (1.2)
Requirement already satisfied: scipy in c:\users\apara\anaconda3\lib\site-packages (from catboost) (1.7.3)
Requirement already satisfied: matplotlib in c:\users\apara\anaconda3\lib\site-packages (from catboost) (3.7.0)
Requirement already satisfied: numpy>=1.16.0 in c:\users\apara\anaconda3\lib\site-packages (from catboost) (1.22.1)
Requirement already satisfied: graphviz in c:\users\apara\anaconda3\lib\site-packages (from catboost) (0.20.1)
Requirement already satisfied: six in c:\users\apara\anaconda3\lib\site-packages (from catboost) (1.16.0)
Requirement already satisfied: pandas>=0.24 in c:\users\apara\anaconda3\lib\site-packages (from catboost) (1.5.3)
Requirement already satisfied: plotly in c:\users\apara\anaconda3\lib\site-packages (from catboost) (5.9.0)
Requirement already satisfied: pytz>=2020.1 in c:\users\apara\anaconda3\lib\site-packages (from pandas>=0.24->catboost) (2022.7)
Requirement already satisfied: python-dateutil>=2.8.1 in c:\users\apara\anaconda3\lib\site-packages (from pandas>=0.24->catboost) (2.8.2)
Requirement already satisfied: cycler>=0.10 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (0.11.0)
Requirement already satisfied: kiwisolver>=1.0.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (1.4.4)
Requirement already satisfied: contourpy>=1.0.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (1.0.5)
Requirement already satisfied: packaging>=20.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (22.0)
Requirement already satisfied: pillow>=6.2.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (9.4.0)
Requirement already satisfied: pyparsing>=2.3.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (3.0.9)
Requirement already satisfied: fonttools>=4.22.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib->catboost) (4.25.0)
Requirement already satisfied: tenacity>=6.2.0 in c:\users\apara\anaconda3\lib\site-packages (from plotly->catboost) (8.0.1)
In [125]:
# Importing required libraries for Cat Boost:

from catboost import CatBoostClassifier
In [126]:
# Create the CatBoost classifier
model_CatBoost = CatBoostClassifier(random_seed=42, verbose=False)

# Define the hyperparameter grid
param_grid_CatBoost = {
    'iterations': [100, 200, 300],  # Number of boosting rounds
    'learning_rate': [0.1, 0.01, 0.001],  # Step size shrinkage to prevent overfitting
    'depth': [3, 5, 7],  # Maximum depth of a tree
}

# Create k-fold cross-validation object (here, k=5)
kfold = KFold(n_splits=5, shuffle=True, random_state=42)

# Create the GridSearchCV object with k-fold cross-validation
grid_search_CatBoost = GridSearchCV(model_CatBoost, param_grid_CatBoost, cv=kfold)
In [127]:
# Perform grid search with k-fold cross-validation to find the best hyperparameters of the CatBoost Model

grid_search_CatBoost.fit(x_train_filtered, y_train_processed)
Out[127]:
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=<catboost.core.CatBoostClassifier object at 0x0000020C0B9D56C0>,
             param_grid={'depth': [3, 5, 7], 'iterations': [100, 200, 300],
                         'learning_rate': [0.1, 0.01, 0.001]})
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
GridSearchCV(cv=KFold(n_splits=5, random_state=42, shuffle=True),
             estimator=<catboost.core.CatBoostClassifier object at 0x0000020C0B9D56C0>,
             param_grid={'depth': [3, 5, 7], 'iterations': [100, 200, 300],
                         'learning_rate': [0.1, 0.01, 0.001]})
<catboost.core.CatBoostClassifier object at 0x0000020C0B9D56C0>
<catboost.core.CatBoostClassifier object at 0x0000020C0B9D56C0>
In [128]:
# Retrieve the best model with optimized hyperparameters: CatBoost

best_model_CatBoost = grid_search_CatBoost.best_estimator_
best_params_CatBoost = grid_search_CatBoost.best_params_
print(best_params_CatBoost)
{'depth': 7, 'iterations': 200, 'learning_rate': 0.1}
In [129]:
# Train the best model on the full training set
best_model_CatBoost.fit(x_train_filtered, y_train_processed)

# Make predictions on the test set
y_pred_CatBoost = best_model_CatBoost.predict(x_valid_processed)
In [130]:
# Evaluate the model using accuracy and classification report
accuracy_CatBoost = accuracy_score(y_valid_processed, y_pred_CatBoost)
accuracy_percentage_CatBoost = accuracy_CatBoost * 100

# Print the accuracy
print("Accuracy of CatBoost Model: {:.2f}%".format(accuracy_percentage_CatBoost))

# Convert the classification report string to a DataFrame
classification_report_CatBoost = classification_report(y_valid_processed, y_pred_CatBoost, output_dict=True)
classification_report_CatBoost_df = pd.DataFrame(classification_report_CatBoost).transpose()

# Convert classification report values to percentages
classification_report_CatBoost_df.iloc[:, :-1] *= 100

# Print the classification report DataFrame
print("CatBoost Classification Report:")
print(classification_report_CatBoost_df)
Accuracy of CatBoost Model: 88.71%
CatBoost Classification Report:
              precision     recall   f1-score     support
0.0           95.205479  90.849673  92.976589  153.000000
1.0           65.000000  78.787879  71.232877   33.000000
accuracy      88.709677  88.709677  88.709677    0.887097
macro avg     80.102740  84.818776  82.104733  186.000000
weighted avg  89.846443  88.709677  89.118833  186.000000

Comparing all the models on the basis of Accuracy:¶

In [134]:
pip install matplotlib
Requirement already satisfied: matplotlib in c:\users\apara\anaconda3\lib\site-packages (3.7.0)
Requirement already satisfied: contourpy>=1.0.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (1.0.5)
Requirement already satisfied: numpy>=1.20 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (1.22.1)
Requirement already satisfied: fonttools>=4.22.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (4.25.0)
Requirement already satisfied: pyparsing>=2.3.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (3.0.9)
Requirement already satisfied: cycler>=0.10 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (0.11.0)
Requirement already satisfied: kiwisolver>=1.0.1 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (1.4.4)
Requirement already satisfied: packaging>=20.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (22.0)
Requirement already satisfied: pillow>=6.2.0 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (9.4.0)
Requirement already satisfied: python-dateutil>=2.7 in c:\users\apara\anaconda3\lib\site-packages (from matplotlib) (2.8.2)
Requirement already satisfied: six>=1.5 in c:\users\apara\anaconda3\lib\site-packages (from python-dateutil>=2.7->matplotlib) (1.16.0)
Note: you may need to restart the kernel to use updated packages.
In [135]:
# List of model names and their corresponding accuracies
model_names = ['Logistic Regression', 'Naive Bayes', 'Decision Tree', 'Random Forest', 'K-Nearest Neighbors', 'XGBoost', 'CatBoost']
accuracies = [accuracy_percentage_LR, accuracy_percentage_NB, accuracy_percentage_DT, accuracy_percentage_RF,accuracy_percentage_KNN, accuracy_percentage_XGB, accuracy_percentage_CatBoost]

# Find the index of the highest accuracy
highest_accuracy_index = np.argmax(accuracies)

# Plotting the accuracy of different models
plt.figure(figsize=(10, 6))
bars = plt.bar(model_names, accuracies, color='MediumBlue')
plt.ylim(0, 100)
plt.xlabel('Models')
plt.ylabel('Accuracy (%)')
plt.title('Model Accuracy Comparison')
plt.xticks(rotation=45, ha='right')

# Displaying the accuracy values above each bar
for i, v in enumerate(accuracies):
    plt.text(i, v + 1, "{:.2f}%".format(v), ha='center', va='bottom', fontsize=9, fontweight='bold')

# Set a different color for the highest accuracy bar
bars[highest_accuracy_index].set_color('Red')

plt.tight_layout()
plt.show()

Prediction on 'Test' Dataset:¶

In [147]:
# Checking the features of test dataset:

test_df.head()
Out[147]:
Id AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EJ EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL
0 00eed32682bb 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
1 010ebe33f668 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
2 02fa521e1838 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
3 040e15f562a2 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
4 046e85c7cc7f 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
In [148]:
# checking datatypes & null values of test dataset:

test_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 5 entries, 0 to 4
Data columns (total 57 columns):
 #   Column  Non-Null Count  Dtype  
---  ------  --------------  -----  
 0   Id      5 non-null      object 
 1   AB      5 non-null      float64
 2   AF      5 non-null      float64
 3   AH      5 non-null      float64
 4   AM      5 non-null      float64
 5   AR      5 non-null      float64
 6   AX      5 non-null      float64
 7   AY      5 non-null      float64
 8   AZ      5 non-null      float64
 9   BC      5 non-null      float64
 10  BD      5 non-null      float64
 11  BN      5 non-null      float64
 12  BP      5 non-null      float64
 13  BQ      5 non-null      float64
 14  BR      5 non-null      float64
 15  BZ      5 non-null      float64
 16  CB      5 non-null      float64
 17  CC      5 non-null      float64
 18  CD      5 non-null      float64
 19  CF      5 non-null      float64
 20  CH      5 non-null      float64
 21  CL      5 non-null      float64
 22  CR      5 non-null      float64
 23  CS      5 non-null      float64
 24  CU      5 non-null      float64
 25  CW      5 non-null      float64
 26  DA      5 non-null      float64
 27  DE      5 non-null      float64
 28  DF      5 non-null      float64
 29  DH      5 non-null      float64
 30  DI      5 non-null      float64
 31  DL      5 non-null      float64
 32  DN      5 non-null      float64
 33  DU      5 non-null      float64
 34  DV      5 non-null      float64
 35  DY      5 non-null      float64
 36  EB      5 non-null      float64
 37  EE      5 non-null      float64
 38  EG      5 non-null      float64
 39  EH      5 non-null      float64
 40  EJ      5 non-null      object 
 41  EL      5 non-null      float64
 42  EP      5 non-null      float64
 43  EU      5 non-null      float64
 44  FC      5 non-null      float64
 45  FD      5 non-null      float64
 46  FE      5 non-null      float64
 47  FI      5 non-null      float64
 48  FL      5 non-null      float64
 49  FR      5 non-null      float64
 50  FS      5 non-null      float64
 51  GB      5 non-null      float64
 52  GE      5 non-null      float64
 53  GF      5 non-null      float64
 54  GH      5 non-null      float64
 55  GI      5 non-null      float64
 56  GL      5 non-null      float64
dtypes: float64(55), object(2)
memory usage: 2.4+ KB
In [149]:
# Dropping the column of id as not required:

test_x = test_df.drop("Id", axis = 1)
test_x.head()
Out[149]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EJ EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL
0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
1 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
2 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
3 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
4 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 A 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
In [150]:
# Fit and transform the 'EJ' column in the train dataset
test_x['EJ'] = label_encoder.fit_transform(test_x['EJ'])

# Print the encoded DataFrame
test_x.head()
Out[150]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EJ EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL
0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
1 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
2 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
3 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
4 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
In [151]:
# Assuming x_train.columns and test_x.columns are not in the same order

test_x = test_x.reindex(columns=x_train.columns)
In [152]:
# Scale the test_x data using the scaler
test_x_scaled = scaler.transform(test_x)

# Convert the scaled array back to a DataFrame
test_x_scaled_df = pd.DataFrame(test_x_scaled, columns=test_x.columns)

# Print the scaled DataFrame
test_x_scaled_df.head()
Out[152]:
AB AF AH AM AR AX AY AZ BC BD BN BP BQ BR BZ CB CC CD CF CH CL CR CS CU CW DA DE DF DH DI DL DN DU DV DY EB EE EG EH EL EP EU FC FD FE FI FL FR FS GB GE GF GH GI GL EJ
0 -0.953938 -1.68874 -0.980614 -0.556082 -1.089497 -2.082763 -0.13559 -2.549916 -0.122038 -1.579278 -6.111085 -1.184475 -1.077923 -0.339968 -0.235943 -0.526381 -2.671424 -1.689007 -0.906916 -2.148923 -0.797822 -2.641951 -2.416079 -2.697119 -1.802673 -2.556235 -1.250301 -0.29423 -3.300414 -1.66411 -3.236556 -3.471453 -0.218693 -1.47925 -1.444573 -1.545561 -1.533729 -0.937579 -0.433332 -1.839692 -1.724393 -0.331081 -0.568678 -0.410914 -0.862248 -3.40328 -0.496826 -0.074137 -0.283943 -2.020124 -0.920747 -0.781901 -3.150813 -1.402969 -0.832533 -1.334408
1 -0.953938 -1.68874 -0.980614 -0.556082 -1.089497 -2.082763 -0.13559 -2.549916 -0.122038 -1.579278 -6.111085 -1.184475 -1.077923 -0.339968 -0.235943 -0.526381 -2.671424 -1.689007 -0.906916 -2.148923 -0.797822 -2.641951 -2.416079 -2.697119 -1.802673 -2.556235 -1.250301 -0.29423 -3.300414 -1.66411 -3.236556 -3.471453 -0.218693 -1.47925 -1.444573 -1.545561 -1.533729 -0.937579 -0.433332 -1.839692 -1.724393 -0.331081 -0.568678 -0.410914 -0.862248 -3.40328 -0.496826 -0.074137 -0.283943 -2.020124 -0.920747 -0.781901 -3.150813 -1.402969 -0.832533 -1.334408
2 -0.953938 -1.68874 -0.980614 -0.556082 -1.089497 -2.082763 -0.13559 -2.549916 -0.122038 -1.579278 -6.111085 -1.184475 -1.077923 -0.339968 -0.235943 -0.526381 -2.671424 -1.689007 -0.906916 -2.148923 -0.797822 -2.641951 -2.416079 -2.697119 -1.802673 -2.556235 -1.250301 -0.29423 -3.300414 -1.66411 -3.236556 -3.471453 -0.218693 -1.47925 -1.444573 -1.545561 -1.533729 -0.937579 -0.433332 -1.839692 -1.724393 -0.331081 -0.568678 -0.410914 -0.862248 -3.40328 -0.496826 -0.074137 -0.283943 -2.020124 -0.920747 -0.781901 -3.150813 -1.402969 -0.832533 -1.334408
3 -0.953938 -1.68874 -0.980614 -0.556082 -1.089497 -2.082763 -0.13559 -2.549916 -0.122038 -1.579278 -6.111085 -1.184475 -1.077923 -0.339968 -0.235943 -0.526381 -2.671424 -1.689007 -0.906916 -2.148923 -0.797822 -2.641951 -2.416079 -2.697119 -1.802673 -2.556235 -1.250301 -0.29423 -3.300414 -1.66411 -3.236556 -3.471453 -0.218693 -1.47925 -1.444573 -1.545561 -1.533729 -0.937579 -0.433332 -1.839692 -1.724393 -0.331081 -0.568678 -0.410914 -0.862248 -3.40328 -0.496826 -0.074137 -0.283943 -2.020124 -0.920747 -0.781901 -3.150813 -1.402969 -0.832533 -1.334408
4 -0.953938 -1.68874 -0.980614 -0.556082 -1.089497 -2.082763 -0.13559 -2.549916 -0.122038 -1.579278 -6.111085 -1.184475 -1.077923 -0.339968 -0.235943 -0.526381 -2.671424 -1.689007 -0.906916 -2.148923 -0.797822 -2.641951 -2.416079 -2.697119 -1.802673 -2.556235 -1.250301 -0.29423 -3.300414 -1.66411 -3.236556 -3.471453 -0.218693 -1.47925 -1.444573 -1.545561 -1.533729 -0.937579 -0.433332 -1.839692 -1.724393 -0.331081 -0.568678 -0.410914 -0.862248 -3.40328 -0.496826 -0.074137 -0.283943 -2.020124 -0.920747 -0.781901 -3.150813 -1.402969 -0.832533 -1.334408
In [153]:
# checking test_x_scaled_df dataframe

test_x_scaled_df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 5 entries, 0 to 4
Data columns (total 56 columns):
 #   Column  Non-Null Count  Dtype  
---  ------  --------------  -----  
 0   AB      5 non-null      float64
 1   AF      5 non-null      float64
 2   AH      5 non-null      float64
 3   AM      5 non-null      float64
 4   AR      5 non-null      float64
 5   AX      5 non-null      float64
 6   AY      5 non-null      float64
 7   AZ      5 non-null      float64
 8   BC      5 non-null      float64
 9   BD      5 non-null      float64
 10  BN      5 non-null      float64
 11  BP      5 non-null      float64
 12  BQ      5 non-null      float64
 13  BR      5 non-null      float64
 14  BZ      5 non-null      float64
 15  CB      5 non-null      float64
 16  CC      5 non-null      float64
 17  CD      5 non-null      float64
 18  CF      5 non-null      float64
 19  CH      5 non-null      float64
 20  CL      5 non-null      float64
 21  CR      5 non-null      float64
 22  CS      5 non-null      float64
 23  CU      5 non-null      float64
 24  CW      5 non-null      float64
 25  DA      5 non-null      float64
 26  DE      5 non-null      float64
 27  DF      5 non-null      float64
 28  DH      5 non-null      float64
 29  DI      5 non-null      float64
 30  DL      5 non-null      float64
 31  DN      5 non-null      float64
 32  DU      5 non-null      float64
 33  DV      5 non-null      float64
 34  DY      5 non-null      float64
 35  EB      5 non-null      float64
 36  EE      5 non-null      float64
 37  EG      5 non-null      float64
 38  EH      5 non-null      float64
 39  EL      5 non-null      float64
 40  EP      5 non-null      float64
 41  EU      5 non-null      float64
 42  FC      5 non-null      float64
 43  FD      5 non-null      float64
 44  FE      5 non-null      float64
 45  FI      5 non-null      float64
 46  FL      5 non-null      float64
 47  FR      5 non-null      float64
 48  FS      5 non-null      float64
 49  GB      5 non-null      float64
 50  GE      5 non-null      float64
 51  GF      5 non-null      float64
 52  GH      5 non-null      float64
 53  GI      5 non-null      float64
 54  GL      5 non-null      float64
 55  EJ      5 non-null      float64
dtypes: float64(56)
memory usage: 2.3 KB
In [154]:
# Assuming x_train_filtered.columns and test_x_scaled_df.columns are not in the same order

test_x_scaled_df = test_x_scaled_df.reindex(columns=x_train_filtered.columns)
In [155]:
# Predicting the outputs on the features of test_x_scaled_df

test_predictions = best_model_RF.predict_proba(test_x_scaled_df)
test_preds = test_predictions[:, 1]
test_preds
Out[155]:
array([0.3309246, 0.3309246, 0.3309246, 0.3309246, 0.3309246])
In [156]:
# Probability of the class to be 0 or 1 for the given test dataset:

sub_df = pd.DataFrame({'Id': test_df['Id'], 'Class_0': 1-test_preds, 'Class_1': test_preds})
sub_df.to_csv('test.csv', index=False)
sub_df.head()
Out[156]:
Id Class_0 Class_1
0 00eed32682bb 0.669075 0.330925
1 010ebe33f668 0.669075 0.330925
2 02fa521e1838 0.669075 0.330925
3 040e15f562a2 0.669075 0.330925
4 046e85c7cc7f 0.669075 0.330925